Commit: 136f9cf
Parent(s): 69269d5
feat: add Hugging Face and Clerk integrations with enhanced configuration
Files changed:
- .gitignore +1 -1
- README.md +91 -0
- bun.lock +41 -47
- package.json +2 -0
- src/App.tsx +27 -19
- src/components/sidebar/app-sidebar.tsx +5 -4
- src/components/ui/loading-screen.tsx +14 -0
- src/components/ui/spinner.tsx +10 -0
- src/contexts/loading-context.tsx +45 -0
- src/hooks/use-chat.ts +89 -0
- src/lib/chat/chat-hf.ts +23 -0
- src/lib/chat/manager.ts +152 -56
- src/lib/chat/memory.ts +14 -5
- src/lib/chat/types.ts +31 -0
- src/lib/config/manager.ts +126 -22
- src/lib/config/types.ts +194 -26
- src/lib/document/manager.ts +22 -2
- src/pages/chat/components/AIMessage.tsx +1 -1
- src/pages/chat/components/ModelSelector.tsx +2 -2
- src/pages/chat/hooks.ts +3 -73
- src/pages/chat/page.tsx +199 -56
- src/pages/home/components/ChatModels.tsx +19 -0
- src/pages/home/components/EmbeddingModels.tsx +19 -0
- src/pages/home/components/ModelList.tsx +165 -0
- src/pages/home/components/Others.tsx +25 -0
- src/pages/home/components/Providers.tsx +262 -0
- src/pages/home/hooks.ts +5 -0
- src/pages/home/page.tsx +22 -265
- src/pages/home/types.ts +27 -0
- src/pages/integrations/huggingface-callback.tsx +103 -0
- src/polyfills.ts +1 -1
- src/routes.tsx +2 -2
- test/Dockerfile +137 -0
- test/jupyter-wasm-launcher.html +85 -0
- test/start.sh +115 -0
.gitignore
CHANGED
@@ -22,5 +22,5 @@ dist-ssr
 *.njsproj
 *.sln
 *.sw?
-.env
+.env*
 package-lock.json
README.md
CHANGED
@@ -84,3 +84,94 @@ react-vite-ui/
 ## 📄 License
 
 This project is licensed under the MIT License. See the [LICENSE](https://choosealicense.com/licenses/mit/) file for details.
+
+# AI Platform with Integrations
+
+This platform allows you to configure and use various AI providers and models, with integrations for Hugging Face and Google Drive.
+
+## Setting Up Integrations
+
+### Hugging Face Integration
+
+To set up the Hugging Face integration:
+
+1. Go to [Hugging Face](https://huggingface.co/) and create an account if you don't have one.
+2. Navigate to your profile settings and go to "Access Tokens".
+3. Create a new OAuth application:
+   - Name: Your App Name
+   - Redirect URI: `https://your-domain.com/integrations/huggingface-callback`
+   - Scopes: `inference-api`
+4. Copy the Client ID and Client Secret.
+5. Update the environment variables in your application:
+   ```
+   OAUTH_CLIENT_ID=your_client_id
+   OAUTH_CLIENT_SECRET=your_client_secret
+   ```
+
+### Google Drive Integration
+
+To set up the Google Drive integration:
+
+1. Go to the [Google Cloud Console](https://console.cloud.google.com/).
+2. Create a new project or select an existing one.
+3. Navigate to "APIs & Services" > "Credentials".
+4. Click "Create Credentials" > "OAuth client ID".
+5. Configure the OAuth consent screen:
+   - User Type: External
+   - App name: Your App Name
+   - User support email: Your email
+   - Developer contact information: Your email
+6. Add the following scopes:
+   - `https://www.googleapis.com/auth/drive.file`
+7. Create the OAuth client ID:
+   - Application type: Web application
+   - Name: Your App Name
+   - Authorized JavaScript origins: `https://your-domain.com`
+   - Authorized redirect URIs: `https://your-domain.com/integrations/google-callback`
+8. Copy the Client ID and Client Secret.
+9. Update the environment variables in your application:
+   ```
+   GOOGLE_CLIENT_ID=your_client_id
+   GOOGLE_CLIENT_SECRET=your_client_secret
+   ```
+
+## Using the Integrations
+
+### Hugging Face
+
+Once authenticated, you can use the Hugging Face Inference API to access models hosted on Hugging Face. The integration provides the following functionality:
+
+- Call the Inference API with any model
+- Access to all public models and your private models
+
+### Google Drive
+
+Once authenticated, you can use the Google Drive API to read and write files. The integration provides the following functionality:
+
+- List files in your Google Drive
+- Upload files to your Google Drive
+- Download files from your Google Drive
+
+## Development
+
+To run the application locally:
+
+1. Clone the repository
+2. Install dependencies: `npm install`
+3. Start the development server: `npm run dev`
+4. Open [http://localhost:3000](http://localhost:3000) in your browser
+
+## Environment Variables
+
+Create a `.env` file in the root directory with the following variables:
+
+```
+OAUTH_CLIENT_ID=your_huggingface_client_id
+OAUTH_CLIENT_SECRET=your_huggingface_client_secret
+GOOGLE_CLIENT_ID=your_google_client_id
+GOOGLE_CLIENT_SECRET=your_google_client_secret
+```
+
+## License
+
+MIT
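The "Using the Integrations" notes in the README stay high level. A rough sketch (not part of the commit) of an Inference API call with the `@huggingface/inference` package added here; the model id and the way the OAuth access token reaches the client are assumptions:

```ts
import { HfInference } from "@huggingface/inference";

// Assumption: the token produced by /integrations/huggingface-callback is available here.
export async function greetViaInferenceAPI(accessToken: string) {
  const hf = new HfInference(accessToken);

  const result = await hf.chatCompletion({
    model: "meta-llama/Llama-3.1-8B-Instruct", // illustrative model id
    messages: [{ role: "user", content: "Hello from the integration!" }],
    max_tokens: 64,
  });

  return result.choices[0].message.content;
}
```

For the Google Drive functionality listed in the README, the plain Drive v3 REST endpoint can be called with the OAuth token from the `drive.file` scope (again a sketch; token handling is assumed):

```ts
// List files visible to the app (drive.file scope) via the Drive v3 REST API.
export async function listDriveFiles(googleAccessToken: string) {
  const res = await fetch(
    "https://www.googleapis.com/drive/v3/files?pageSize=10&fields=files(id,name)",
    { headers: { Authorization: `Bearer ${googleAccessToken}` } }
  );
  if (!res.ok) throw new Error(`Drive API error: ${res.status}`);
  const data: { files: { id: string; name: string }[] } = await res.json();
  return data.files;
}
```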
bun.lock
CHANGED

Lockfile regenerated for the new dependencies; integrity hashes and most resolved-version metadata are elided in this view.

Dependencies block (@@ -4,7 +4,9 @@ and @@ -42,7 +44,7 @@):
+ "@clerk/clerk-react": "^5.23.0"
+ "@huggingface/inference": "2"
  "cmdk" range updated to "^1.0.4"

Added lock entries:
+ @clerk/clerk-react, @clerk/shared, @clerk/types
+ @huggingface/inference, @huggingface/tasks
+ glob-to-regexp, js-cookie, std-env, swr, use-sync-external-store
+ package-scoped tslib pins for aria-hidden, react-remove-scroll, react-remove-scroll-bar, react-style-singleton, rxjs, use-callback-ref, use-sidecar, and youtubei.js

Updated lock entries:
  cmdk (now resolving @radix-ui/react-dialog ^1.1.2, @radix-ui/react-id ^1.1.0, @radix-ui/react-primitive ^2.0.0, use-sync-external-store ^1.2.2)
  tslib (top-level entry now 2.4.1, as required by @clerk/clerk-react)

Removed lock entries:
- the cmdk-scoped @radix-ui/react-dialog and @radix-ui/react-primitive pins and all of their nested @radix-ui sub-dependency entries (obsolete after the cmdk update)
package.json
CHANGED
@@ -13,7 +13,9 @@
     "deploy": "bun run build && npx wrangler pages deploy dist"
   },
   "dependencies": {
+    "@clerk/clerk-react": "^5.23.0",
     "@hookform/resolvers": "^4.1.1",
+    "@huggingface/inference": "2",
     "@langchain/anthropic": "^0.3.13",
     "@langchain/community": "^0.3.32",
     "@langchain/core": "^0.3.40",
src/App.tsx
CHANGED
@@ -4,37 +4,45 @@ import { routes } from "@/routes";
 import Layout from "@/layout";
 import { Toaster } from "sonner";
 import { ChatPage } from "./pages/chat/page";
+import { LoadingProvider } from "@/contexts/loading-context";
+import { HuggingFaceCallback } from "./pages/integrations/huggingface-callback";
 
 function App() {
   const queryClient = new QueryClient();
 
   return (
     <QueryClientProvider client={queryClient}>
+      <LoadingProvider>
+        <BrowserRouter>
+          <Routes>
+            {routes.map((route) => (
+              <Route
+                key={route.path}
+                path={route.path}
+                element={
+                  <Layout>
+                    {route.component}
+                  </Layout>
+                }
+              />
+            ))}
             <Route
+              key="chat"
+              path="/chat/:id"
               element={
                 <Layout>
+                  <ChatPage />
                 </Layout>
               }
             />
+            <Route
+              path="/integrations/huggingface-callback"
+              element={<HuggingFaceCallback />}
+            />
+          </Routes>
+        </BrowserRouter>
+        <Toaster />
+      </LoadingProvider>
     </QueryClientProvider>
   );
 }
src/components/sidebar/app-sidebar.tsx
CHANGED
@@ -18,13 +18,14 @@ import { useLiveQuery } from "dexie-react-hooks";
 import { ChatHistoryDB } from "@/lib/chat/memory";
 import { useState } from "react";
 import { toast } from "sonner";
+import { IChatSession } from "@/lib/chat/types";
 
 export function AppSidebar() {
   const location = useLocation();
   const navigate = useNavigate();
   const chats = useLiveQuery(async () => {
-    const sessions = await
-    return sessions.sort((a, b) => b.updatedAt - a.updatedAt);
+    const sessions = await ChatHistoryDB.getInstance().sessions.toArray();
+    return sessions.sort((a: IChatSession, b: IChatSession) => b.updatedAt - a.updatedAt);
   });
   const [hoveringId, setHoveringId] = useState<string | null>(null);
 
@@ -63,7 +64,7 @@ export function AppSidebar() {
       <SidebarGroupLabel>Chats</SidebarGroupLabel>
       <SidebarGroupContent>
         <SidebarMenu>
-          {chats?.map((chat) => (
+          {chats?.map((chat: IChatSession) => (
             <SidebarMenuItem key={chat.id}>
               <SidebarMenuButton
                 asChild
@@ -85,7 +86,7 @@ export function AppSidebar() {
               }`}
               onClick={(e) => {
                 e.preventDefault();
+                ChatHistoryDB.getInstance().sessions.delete(chat.id);
                 toast.success("Chat deleted");
                 return navigate("/chat/new");
               }}
src/components/ui/loading-screen.tsx
ADDED
@@ -0,0 +1,14 @@
+interface LoadingScreenProps {
+  message?: string;
+}
+
+export function LoadingScreen({ message = "Loading..." }: LoadingScreenProps) {
+  return (
+    <div className="fixed inset-0 flex flex-col items-center justify-center bg-background/80 backdrop-blur-sm z-50">
+      <div className="flex flex-col items-center gap-4 p-6 rounded-lg bg-card shadow-lg">
+        <div className="animate-spin rounded-full h-12 w-12 border-b-2 border-primary"></div>
+        <p className="text-lg font-medium text-foreground">{message}</p>
+      </div>
+    </div>
+  );
+}
src/components/ui/spinner.tsx
ADDED
@@ -0,0 +1,10 @@
+import { cn } from "@/lib/utils";
+import { Loader2 } from "lucide-react";
+
+export function Spinner({ className, ...props }: React.HTMLAttributes<HTMLDivElement>) {
+  return (
+    <div className={cn("animate-spin", className)} {...props}>
+      <Loader2 className="h-full w-full" />
+    </div>
+  );
+}
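A minimal usage sketch for the new Spinner; the enclosing button component is illustrative, not part of the commit:

```tsx
import { Spinner } from "@/components/ui/spinner";

// Hypothetical call site: show a small spinner inside a disabled submit button.
export function SendButton({ pending }: { pending: boolean }) {
  return (
    <button disabled={pending} className="inline-flex items-center gap-2">
      {pending && <Spinner className="h-4 w-4" />}
      Send
    </button>
  );
}
```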
src/contexts/loading-context.tsx
ADDED
@@ -0,0 +1,45 @@
+import { createContext, useContext, useState, ReactNode } from "react";
+import { LoadingScreen } from "@/components/ui/loading-screen";
+
+interface LoadingContextType {
+  isLoading: boolean;
+  message: string;
+  startLoading: (message?: string) => void;
+  stopLoading: () => void;
+}
+
+const LoadingContext = createContext<LoadingContextType | undefined>(undefined);
+
+interface LoadingProviderProps {
+  children: ReactNode;
+}
+
+export function LoadingProvider({ children }: LoadingProviderProps) {
+  const [isLoading, setIsLoading] = useState(false);
+  const [message, setMessage] = useState("Loading...");
+
+  const startLoading = (message = "Loading...") => {
+    setMessage(message);
+    setIsLoading(true);
+  };
+
+  const stopLoading = () => {
+    setIsLoading(false);
+  };
+
+  return (
+    <LoadingContext.Provider value={{ isLoading, message, startLoading, stopLoading }}>
+      {children}
+      {isLoading && <LoadingScreen message={message} />}
+    </LoadingContext.Provider>
+  );
+}
+
+// eslint-disable-next-line react-refresh/only-export-components
+export function useLoading() {
+  const context = useContext(LoadingContext);
+  if (context === undefined) {
+    throw new Error("useLoading must be used within a LoadingProvider");
+  }
+  return context;
+}
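How a component might consume the new loading context — a sketch; the component and the awaited work are hypothetical, but the hook API matches the file above:

```tsx
import { useLoading } from "@/contexts/loading-context";

// Hypothetical component: wrap an async action with the global loading overlay.
export function SyncButton({ sync }: { sync: () => Promise<void> }) {
  const { startLoading, stopLoading } = useLoading();

  const handleClick = async () => {
    startLoading("Syncing...");
    try {
      await sync();
    } finally {
      stopLoading();
    }
  };

  return <button onClick={handleClick}>Sync</button>;
}
```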
src/hooks/use-chat.ts
ADDED
@@ -0,0 +1,89 @@
+import React from "react";
+import { useLiveQuery } from "dexie-react-hooks";
+import { ChatHistoryDB } from "@/lib/chat/memory";
+import { HumanMessage, AIMessageChunk } from "@langchain/core/messages";
+import { ChatManager } from "@/lib/chat/manager";
+import { IDocument } from "@/lib/document/types";
+import { toast } from "sonner";
+import { IConfig } from "@/lib/config/types";
+
+// Get the singleton instance of ChatHistoryDB
+const chatHistoryDB = ChatHistoryDB.getInstance();
+
+export const useChatSession = (id: string | undefined) => {
+  return useLiveQuery(async () => {
+    if (!id || id === "new") return null;
+    return await chatHistoryDB.sessions.get(id);
+  }, [id]);
+};
+
+export interface ChatSessionConfig {
+  config?: IConfig;
+  id?: string;
+}
+
+export const useSelectedModel = (id: string | undefined, config: IConfig | undefined) => {
+  const [selectedModel, setSelectedModel] = React.useState<string | null>(null);
+
+  useLiveQuery(async () => {
+    if (!config) return;
+
+    const model = !id || id === "new"
+      ? config.default_chat_model
+      : (await chatHistoryDB.sessions.get(id))?.model ?? config.default_chat_model;
+    setSelectedModel(model);
+  }, [id, config]);
+
+  return [selectedModel, setSelectedModel, chatHistoryDB] as const;
+};
+
+// Use the singleton pattern for ChatManager
+export const useChatManager = () => {
+  return React.useMemo(() => ChatManager.getInstance(), []);
+};
+
+export const generateMessage = async (
+  chatId: string | undefined,
+  input: string,
+  attachments: IDocument[],
+  isGenerating: boolean,
+  setIsGenerating: (isGenerating: boolean) => void,
+  setStreamingHumanMessage: (streamingHumanMessage: HumanMessage | null) => void,
+  setStreamingAIMessageChunks: React.Dispatch<React.SetStateAction<AIMessageChunk[]>>,
+  chatManager: ChatManager,
+  setInput: (input: string) => void,
+  setAttachments: (attachments: IDocument[]) => void
+) => {
+  if (!chatId || isGenerating) return;
+  if (!input.trim() && !attachments.length) {
+    return;
+  }
+
+  try {
+    setIsGenerating(true);
+
+    const chatInput = input;
+    const chatAttachments = attachments;
+
+    setInput("");
+    setAttachments([]);
+    setStreamingHumanMessage(new HumanMessage(chatInput));
+    setStreamingAIMessageChunks([]);
+
+    const messageIterator = chatManager.chat(chatId, chatInput, chatAttachments);
+
+    for await (const event of messageIterator) {
+      if (event.type === "stream") {
+        setStreamingAIMessageChunks(prev => [...prev, event.content as AIMessageChunk]);
+      } else if (event.type === "end") {
+        setIsGenerating(false);
+        setStreamingHumanMessage(null);
+        setStreamingAIMessageChunks([]);
+      }
+    }
+  } catch (error) {
+    console.error(error);
+    toast.error(`Failed to send message: ${error}`);
+    setIsGenerating(false);
+  }
+};
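A sketch of how a chat page might wire these hooks together (the wrapper hook and its state names are illustrative; only the imported APIs come from the file above):

```tsx
import React from "react";
import { HumanMessage, AIMessageChunk } from "@langchain/core/messages";
import { IDocument } from "@/lib/document/types";
import { useChatManager, generateMessage } from "@/hooks/use-chat";

// Hypothetical call site: local UI state plus the shared ChatManager singleton.
export function useSendMessage(chatId: string | undefined) {
  const chatManager = useChatManager();
  const [input, setInput] = React.useState("");
  const [attachments, setAttachments] = React.useState<IDocument[]>([]);
  const [isGenerating, setIsGenerating] = React.useState(false);
  const [streamingHumanMessage, setStreamingHumanMessage] =
    React.useState<HumanMessage | null>(null);
  const [streamingAIMessageChunks, setStreamingAIMessageChunks] =
    React.useState<AIMessageChunk[]>([]);

  const send = () =>
    generateMessage(
      chatId,
      input,
      attachments,
      isGenerating,
      setIsGenerating,
      setStreamingHumanMessage,
      setStreamingAIMessageChunks,
      chatManager,
      setInput,
      setAttachments
    );

  return {
    input, setInput, attachments, setAttachments,
    isGenerating, streamingHumanMessage, streamingAIMessageChunks, send,
  };
}
```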
src/lib/chat/chat-hf.ts
ADDED
@@ -0,0 +1,23 @@
+ import { ChatOpenAI } from "@langchain/openai";
+
+ export const ChatHFInference = ({
+   modelName,
+   apiKey
+ }: {
+   modelName: string;
+   apiKey: string;
+ }) => {
+   if (!apiKey) {
+     throw new Error("Hugging Face API token is required");
+   }
+
+   return new ChatOpenAI(
+     {
+       model: modelName,
+       apiKey: apiKey,
+       configuration: {
+         baseURL: "https://api-inference.huggingface.co/v1/"
+       }
+     },
+   );
+ }
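A minimal usage sketch of the factory above; the model id and token literal are placeholders (in the app the token comes from `config.hf_token`), and the call relies only on the standard `ChatOpenAI.invoke` API.

```
import { ChatHFInference } from "@/lib/chat/chat-hf";

async function askHuggingFace() {
  // Placeholder token; the real value is read from the stored config.
  const model = ChatHFInference({
    modelName: "mistralai/Mistral-7B-Instruct-v0.2",
    apiKey: "hf_xxxxxxxx",
  });

  // ChatOpenAI accepts a plain string prompt and resolves with an AI message.
  const reply = await model.invoke("Say hello in one short sentence.");
  console.log(reply.content);
}
```

Because the factory returns a regular `ChatOpenAI` instance pointed at Hugging Face's OpenAI-compatible endpoint, the rest of the LangChain chat-model surface (streaming, callbacks) is expected to work the same way, subject to what the endpoint supports.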
src/lib/chat/manager.ts
CHANGED
@@ -17,91 +17,179 @@ import { DocumentManager } from "@/lib/document/manager";
  import { Document } from "@langchain/core/documents";
  import { HumanMessage, ToolMessage } from "@langchain/core/messages";
  import { IChatSession } from "./types";
+ import { ChatHFInference } from "./chat-hf";
+
+ // Define an error interface for better type safety
+ interface ErrorWithMessage {
+   message: string;
+ }
+
+ function isErrorWithMessage(error: unknown): error is ErrorWithMessage {
+   return (
+     typeof error === 'object' &&
+     error !== null &&
+     'message' in error &&
+     typeof (error as Record<string, unknown>).message === 'string'
+   );
+ }
+
+ function toErrorWithMessage(error: unknown): ErrorWithMessage {
+   if (isErrorWithMessage(error)) return error;
+
+   try {
+     return new Error(String(error));
+   } catch {
+     // fallback in case there's an error stringifying the error
+     return new Error('Unknown error');
+   }
+ }
+
+ function getErrorMessage(error: unknown): string {
+   return toErrorWithMessage(error).message;
+ }

  export class ChatManager {
    model!: BaseChatModel;
-   embeddings!: Embeddings;
-   controller
-   configManager
+   embeddings!: Embeddings | null;
+   controller: AbortController;
+   configManager: ConfigManager;
    config!: IConfig;
-   documentManager
+   documentManager: DocumentManager;
+   private static instance: ChatManager | null = null;

    constructor() {
      this.controller = new AbortController();
-     this.
+     this.configManager = ConfigManager.getInstance();
      this.documentManager = DocumentManager.getInstance();
+     this.initializeConfig();
    }

+   public static getInstance(): ChatManager {
+     if (!ChatManager.instance) {
+       ChatManager.instance = new ChatManager();
+     }
+     return ChatManager.instance;
+   }
+
+   private async initializeConfig() {
      this.config = await this.configManager.getConfig();
    }

+   public resetController() {
+     this.controller = new AbortController();
+   }
+
    private async getChatModel(modelName: string): Promise<BaseChatModel> {
+     // Ensure config is loaded
+     if (!this.config) {
+       await this.initializeConfig();
+     }
+
      const model = CHAT_MODELS.find(m => m.model === modelName);

      if (!model) {
        throw new Error(`Chat model ${modelName} not found`);
      }

+     try {
+       switch (model.provider) {
+         case PROVIDERS.ollama:
+           return new ChatOllama({
+             baseUrl: this.config.ollama_base_url,
+             model: model.model,
+           });
+
+         case PROVIDERS.openai:
+           return new ChatOpenAI({
+             modelName: this.config.openai_model && this.config.openai_model.trim() !== '' ? this.config.openai_model : model.model,
+             apiKey: this.config.openai_api_key,
+             configuration: {
+               baseURL: this.config.openai_base_url && this.config.openai_base_url.trim() !== '' ? this.config.openai_base_url : undefined,
+             }
+           });
+
+         case PROVIDERS.anthropic:
+           return new ChatAnthropic({
+             modelName: model.model,
+             apiKey: this.config.anthropic_api_key,
+           });
+
+         case PROVIDERS.gemini:
+           return new ChatGoogleGenerativeAI({
+             modelName: model.model,
+             apiKey: this.config.gemini_api_key,
+           });
+
+         case PROVIDERS.huggingface:
+           return ChatHFInference({
+             modelName: model.model,
+             apiKey: this.config.hf_token,
+           });
+
+         default:
+           throw new Error(`Provider ${model.provider} not implemented yet for chat models`);
+       }
+     } catch (error: unknown) {
+       console.error(`Error creating chat model ${modelName}:`, error);
+       throw new Error(`Failed to initialize chat model ${modelName}: ${getErrorMessage(error)}`);
      }
    }

    private async getEmbeddingModel(modelName: string): Promise<Embeddings> {
+     // Ensure config is loaded
+     if (!this.config) {
+       await this.initializeConfig();
+     }
+
+     if (!modelName) {
+       throw new Error("No embedding model specified");
+     }
+
      const model = EMBEDDING_MODELS.find(m => m.model === modelName);

      if (!model) {
        throw new Error(`Embedding model ${modelName} not found`);
      }

+     // Check if trying to use Ollama when it's not available
+     if (model.provider === PROVIDERS.ollama) {
+       // Check if Ollama base URL is not configured
+       if (!this.config.ollama_base_url || this.config.ollama_base_url.trim() === '') {
+         throw new Error(`Ollama base URL is not configured. Please set a valid URL in the settings.`);
+       }
+
+       // Check if Ollama is not available
+       if (!this.config.ollama_available) {
+         throw new Error(`Ollama server is not available. Please check your connection to ${this.config.ollama_base_url}`);
+       }
+     }
+
+     try {
+       switch (model.provider) {
+         case PROVIDERS.ollama:
+           return new OllamaEmbeddings({
+             baseUrl: this.config.ollama_base_url,
+             model: model.model,
+           });
+
+         case PROVIDERS.openai:
+           return new OpenAIEmbeddings({
+             modelName: model.model,
+             apiKey: this.config.openai_api_key,
+           });
+
+         case PROVIDERS.gemini:
+           return new GoogleGenerativeAIEmbeddings({
+             modelName: model.model,
+             apiKey: this.config.gemini_api_key,
+           });
+
+         default:
+           throw new Error(`Provider ${model.provider} not implemented yet for embedding models`);
+       }
+     } catch (error: unknown) {
+       console.error(`Error creating embedding model ${modelName}:`, error);
+       throw new Error(`Failed to initialize embedding model ${modelName}: ${getErrorMessage(error)}`);
      }
    }

@@ -298,6 +386,14 @@ export class ChatManager {

      }

      return content;
+   },
+
+   [PROVIDERS.huggingface]: async () => {
+     // Hugging Face Inference API primarily supports text
+     return processedContent.docs.map(doc => ({
+       type: "text",
+       text: `File name: ${doc.metadata.name}\nFile content: ${doc.pageContent}`
+     }));
    }
  };

@@ -335,7 +431,7 @@ export class ChatManager {

    const chatSession = await memory.db.table("sessions").get(sessionId);

    this.model = await this.getChatModel(chatSession?.model || this.config.default_chat_model);
-   this.embeddings = await this.getEmbeddingModel(chatSession?.embedding_model || this.config.default_embedding_model);
+   this.embeddings = await this.getEmbeddingModel(chatSession?.embedding_model || this.config.default_embedding_model || null);

    const agent = await this.getAgent(chatSession?.enabled_tools || []);
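As a rough illustration of how callers are meant to use the new singleton together with `resetController`/`abort`; the session id and prompt are placeholders, and the event shape mirrors how `use-chat.ts` above consumes `chat()`.

```
import { ChatManager } from "@/lib/chat/manager";
import { AIMessageChunk } from "@langchain/core/messages";

async function streamOnce(sessionId: string) {
  // Same instance everywhere, so one AbortController governs the in-flight generation.
  const manager = ChatManager.getInstance();

  let text = "";
  for await (const event of manager.chat(sessionId, "Explain RAG briefly.", [])) {
    if (event.type === "stream") {
      text += String((event.content as AIMessageChunk).content);
    }
  }
  return text;
}

// Cancelling: abort the current controller, then reset it before the next run.
function cancel(manager: ChatManager) {
  manager.controller.abort();
  manager.resetController();
}
```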
src/lib/chat/memory.ts
CHANGED
@@ -11,15 +11,24 @@ import {
  } from "@langchain/core/messages";
  import { IConfig, CHAT_MODELS, EMBEDDING_MODELS } from "@/lib/config/types";

+ // Create a singleton instance of ChatHistoryDB
  export class ChatHistoryDB extends Dexie {
    sessions!: Dexie.Table<IChatSession, string>;
+   private static instance: ChatHistoryDB | null = null;

-   constructor() {
+   private constructor() {
      super("chat_history");
      this.version(1).stores({
        sessions: "id, title, createdAt, updatedAt, model, embedding_model, enabled_tools, messages",
      });
    }
+
+   public static getInstance(): ChatHistoryDB {
+     if (!ChatHistoryDB.instance) {
+       ChatHistoryDB.instance = new ChatHistoryDB();
+     }
+     return ChatHistoryDB.instance;
+   }
  }

  export class DexieChatMemory extends BaseChatMessageHistory {

@@ -28,19 +37,19 @@ export class DexieChatMemory extends BaseChatMessageHistory {

    sessionId: string;
    chatHistory!: IChatSession | undefined;
    config!: IConfig;
-   configManager
+   configManager: ConfigManager;
    initialized: boolean = false;

    constructor(sessionId: string) {
      super();
      this.sessionId = sessionId;
-     this.db =
+     this.db = ChatHistoryDB.getInstance();
+     this.configManager = ConfigManager.getInstance();
    }

    async initialize() {
      if (this.initialized) return;

-     this.configManager = await ConfigManager.getInstance();
      this.config = await this.configManager.getConfig();
      this.chatHistory = await this.db.sessions.get(this.sessionId);

@@ -49,7 +58,7 @@ export class DexieChatMemory extends BaseChatMessageHistory {

    const embeddingModel = EMBEDDING_MODELS.find((m) => m.model === this.config.default_embedding_model);

    if (!chatModel || !embeddingModel) {
-     throw new Error("
+     throw new Error("Chat or embedding models are not configured.");
    }

    this.chatHistory = {
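A small sketch of the intended access pattern for the new singleton; `sessionId` is a placeholder and the field names follow the `sessions` store schema above. Because every caller shares one Dexie connection, IndexedDB is opened only once.

```
import { ChatHistoryDB } from "@/lib/chat/memory";

async function touchSession(sessionId: string) {
  const db = ChatHistoryDB.getInstance();

  const session = await db.sessions.get(sessionId);
  if (session) {
    await db.sessions.update(sessionId, { updatedAt: Date.now() });
  }
  return session;
}
```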
src/lib/chat/types.ts
CHANGED
@@ -9,4 +9,35 @@ export interface IChatSession {
    model: string;
    embedding_model: string;
    enabled_tools: string[];
+   python_session_id?: string;
  }
+
+ export type ToolType =
+   | "calculator"
+   | "python"
+   | "python_repl"
+   | "python_visualization"
+   | "python_data_processing";
+
+ export const AVAILABLE_TOOLS: Record<ToolType, { name: string; description: string }> = {
+   calculator: {
+     name: "Calculator",
+     description: "Perform mathematical calculations"
+   },
+   python: {
+     name: "Python",
+     description: "Execute Python code with data science libraries"
+   },
+   python_repl: {
+     name: "Python REPL",
+     description: "Interactive Python REPL with state persistence"
+   },
+   python_visualization: {
+     name: "Python Visualization",
+     description: "Generate visualizations using matplotlib, plotly, etc."
+   },
+   python_data_processing: {
+     name: "Python Data Processing",
+     description: "Process data using pandas, numpy, scikit-learn, etc."
+   }
+ };
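Since `AVAILABLE_TOOLS` is keyed by `ToolType`, a settings UI can derive its options directly from the registry; a minimal sketch:

```
import { AVAILABLE_TOOLS, ToolType } from "@/lib/chat/types";

// Turn the registry into option objects for a checkbox list or multi-select.
const toolOptions = (Object.keys(AVAILABLE_TOOLS) as ToolType[]).map((key) => ({
  value: key,
  label: AVAILABLE_TOOLS[key].name,
  description: AVAILABLE_TOOLS[key].description,
}));
```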
src/lib/config/manager.ts
CHANGED
@@ -1,17 +1,18 @@
  import Dexie, { Table } from "dexie";
- import { IConfig, DEFAULT_CONFIG } from "./types";
+ import { IConfig, DEFAULT_CONFIG, loadAllModels } from "./types";

  export class ConfigManager extends Dexie {
    configs!: Table<IConfig>;
    private static instance: ConfigManager;
    private currentConfig: IConfig | null = null;
+   private initPromise: Promise<IConfig> | null = null;

    private constructor() {
      super("configs");
      this.version(1).stores({
        configs: "id, createdAt, updatedAt",
      });
-     this.initialize();
+     this.initPromise = this.initialize();
    }

    public static getInstance(): ConfigManager {

@@ -21,30 +22,93 @@ export class ConfigManager extends Dexie {

      return ConfigManager.instance;
    }

-   async initialize() {
+   async initialize(): Promise<IConfig> {
+     // Return existing config or in-progress initialization
+     if (this.currentConfig) return this.currentConfig;
+     if (this.initPromise) return this.initPromise;
+
+     try {
+       const configs = await this.configs.toArray();
+
+       if (configs.length === 0) {
+         // Create default config if none exists
+         const defaultConfig: IConfig = {
+           ...DEFAULT_CONFIG,
+           id: crypto.randomUUID(),
+           createdAt: Date.now(),
+           updatedAt: Date.now(),
+           ollama_available: false
+         };
+
+         // Check Ollama availability if URL is provided
+         if (defaultConfig.ollama_base_url.trim() !== '') {
+           defaultConfig.ollama_available = await this.checkOllamaAvailability(defaultConfig.ollama_base_url);
+         }
+
+         await this.configs.add(defaultConfig);
+         this.currentConfig = defaultConfig;
+       } else {
+         // Use the most recently updated config
+         this.currentConfig = configs.sort((a, b) => b.updatedAt - a.updatedAt)[0];
+
+         // Check Ollama availability if URL is provided
+         if (this.currentConfig.ollama_base_url.trim() !== '') {
+           const ollamaAvailable = await this.checkOllamaAvailability(this.currentConfig.ollama_base_url);
+
+           // Update availability if changed
+           if (ollamaAvailable !== this.currentConfig.ollama_available) {
+             await this.updateConfig({ ollama_available: ollamaAvailable });
+           }
+         }
+       }
+
+       // Load models after config is ready
+       await this.loadModels();
+       return this.currentConfig;
+     } catch (error) {
+       console.error("Error initializing config:", error);
+       throw error;
+     } finally {
+       this.initPromise = null;
      }
    }

+   private async checkOllamaAvailability(baseUrl: string): Promise<boolean> {
+     if (!baseUrl || baseUrl.trim() === '') return false;
+
+     try {
+       const response = await fetch(`${baseUrl}/api/tags`);
+       if (!response.ok) return false;
+
+       const data = await response.json();
+       return Array.isArray(data.models) && data.models.length > 0;
+     } catch (error) {
+       console.error("Ollama server not available:", error);
+       return false;
+     }
+   }
+
+   private async loadModels(): Promise<void> {
+     if (!this.currentConfig) return;
+
+     try {
+       const { ollamaAvailable } = await loadAllModels(
+         this.currentConfig.ollama_base_url,
+         this.currentConfig.openai_model,
+         this.currentConfig.hf_custom_models
+       );
+
+       // Update availability if changed
+       if (ollamaAvailable !== this.currentConfig.ollama_available) {
+         await this.updateConfig({ ollama_available: ollamaAvailable });
+       }
+     } catch (error) {
+       console.error("Error loading models:", error);
+     }
+   }
+
+   async getConfig(): Promise<IConfig> {
+     return this.currentConfig || this.initialize();
    }

    async updateConfig(updates: Partial<Omit<IConfig, 'id' | 'createdAt' | 'updatedAt'>>): Promise<IConfig> {

@@ -57,6 +121,36 @@ export class ConfigManager extends Dexie {

      await this.configs.put(updatedConfig);
      this.currentConfig = updatedConfig;
+
+     // Handle Ollama URL updates
+     if (updates.ollama_base_url !== undefined &&
+         updates.ollama_base_url !== current.ollama_base_url) {
+
+       // Check availability only if URL is not empty
+       const ollamaAvailable = updates.ollama_base_url.trim() === ''
+         ? false
+         : await this.checkOllamaAvailability(updates.ollama_base_url);
+
+       if (ollamaAvailable !== this.currentConfig.ollama_available) {
+         return this.updateConfig({ ollama_available: ollamaAvailable });
+       }
+
+       // Reload models if URL changed
+       await this.loadModels();
+     }
+
+     // Reload models if OpenAI custom models changed
+     if (updates.openai_model !== undefined &&
+         updates.openai_model !== current.openai_model) {
+       await this.loadModels();
+     }
+
+     // Reload models if Hugging Face custom models changed
+     if (updates.hf_custom_models !== undefined &&
+         updates.hf_custom_models !== current.hf_custom_models) {
+       await this.loadModels();
+     }
+
      return updatedConfig;
    }

@@ -67,10 +161,20 @@ export class ConfigManager extends Dexie {

      id: current.id,
      createdAt: current.createdAt,
      updatedAt: Date.now(),
+     ollama_available: false
    };

+   // Check Ollama availability if URL is provided
+   if (resetConfig.ollama_base_url.trim() !== '') {
+     resetConfig.ollama_available = await this.checkOllamaAvailability(resetConfig.ollama_base_url);
+   }
+
    await this.configs.put(resetConfig);
    this.currentConfig = resetConfig;
+
+   // Reload models after reset
+   await this.loadModels();
+
    return resetConfig;
  }
}
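A sketch of the caller-facing flow, assuming only the methods shown in this diff; the URL literal is a placeholder.

```
import { ConfigManager } from "@/lib/config/manager";

async function pointAtLocalOllama() {
  const configManager = ConfigManager.getInstance();

  // Resolves once initialize() has created or loaded a config row.
  const config = await configManager.getConfig();
  console.log("Ollama currently available:", config.ollama_available);

  // Changing the base URL triggers the availability check and a model reload.
  await configManager.updateConfig({ ollama_base_url: "http://localhost:11434" });
}
```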
src/lib/config/types.ts
CHANGED
@@ -1,4 +1,4 @@
- import ollama from "ollama/browser";
+ // import ollama from "ollama/browser";

  export interface IConfig {
    id: string;

@@ -10,8 +10,14 @@ export interface IConfig {

    enabled_embedding_models: string[];
    ollama_base_url: string;
    openai_api_key: string;
+   openai_base_url: string;
+   openai_model: string;
    anthropic_api_key: string;
    gemini_api_key: string;
+   ollama_available: boolean;
+   hf_token: string;
+   hf_model: string;
+   hf_custom_models: string;
  }

  export const DEFAULT_CONFIG: Omit<IConfig, 'id' | 'createdAt' | 'updatedAt'> = {

@@ -21,8 +27,14 @@ export const DEFAULT_CONFIG: Omit<IConfig, 'id' | 'createdAt' | 'updatedAt'> = {

    enabled_embedding_models: [],
    ollama_base_url: 'http://localhost:11434',
    openai_api_key: '',
+   openai_base_url: '',
+   openai_model: '',
    anthropic_api_key: '',
-   gemini_api_key: ''
+   gemini_api_key: '',
+   ollama_available: false,
+   hf_token: '',
+   hf_model: '',
+   hf_custom_models: '',
  };

  export enum PROVIDERS {

@@ -30,6 +42,7 @@ export enum PROVIDERS {

    anthropic = 'anthropic',
    ollama = 'ollama',
    gemini = 'gemini',
+   huggingface = 'huggingface',
  }

  export enum MODALITIES {

@@ -40,10 +53,11 @@ export enum MODALITIES {

  }

  export const PROVIDERS_CONN_ARGS_MAP: Record<PROVIDERS, string[]> = {
-   [PROVIDERS.openai]: ['openai_api_key'],
+   [PROVIDERS.openai]: ['openai_api_key', 'openai_base_url', 'openai_model'],
    [PROVIDERS.anthropic]: ['anthropic_api_key'],
    [PROVIDERS.ollama]: ['ollama_base_url'],
    [PROVIDERS.gemini]: ['gemini_api_key'],
+   [PROVIDERS.huggingface]: ['hf_token', 'hf_custom_models'],
  };

  export interface BaseModel {

@@ -61,7 +75,8 @@ export interface ChatModel extends BaseModel {

  export type EmbeddingModel = BaseModel

+ // Base models that are always available
+ export const BASE_CHAT_MODELS: ChatModel[] = [
    {
      name: 'GPT-4o',
      provider: PROVIDERS.openai,

@@ -135,19 +150,26 @@ export const CHAT_MODELS: ChatModel[] = [

      modalities: [MODALITIES.image, MODALITIES.pdf, MODALITIES.audio, MODALITIES.video],
      isReasoning: false,
    },
+   {
+     name: 'Hugging Face - Mistral',
+     provider: PROVIDERS.huggingface,
+     model: 'mistralai/Mistral-7B-Instruct-v0.2',
+     description: 'Mistral AI 7B Instruct model',
+     modalities: [],
+     isReasoning: false,
+   },
+   {
+     name: 'Hugging Face - Llama 2',
+     provider: PROVIDERS.huggingface,
+     model: 'meta-llama/Llama-2-7b-chat-hf',
+     description: 'Meta Llama 2 7B Chat',
+     modalities: [],
+     isReasoning: false,
+   },
+ ];
+
+ // Base embedding models that are always available
+ export const BASE_EMBEDDING_MODELS: EmbeddingModel[] = [
    {
      name: 'OpenAI Embeddings',
      provider: PROVIDERS.openai,

@@ -160,12 +182,158 @@ export const EMBEDDING_MODELS: EmbeddingModel[] = [

      model: 'text-embedding-005',
      description: 'Google GenAI Embeddings',
    },
+ ];
+
+ // Function to safely get custom OpenAI models
+ export async function getCustomOpenAIModels(openaiModel?: string): Promise<ChatModel[]> {
+   if (!openaiModel || openaiModel.trim() === '') {
+     return [];
+   }
+
+   // Split the comma-separated list of models into an array
+   const models = openaiModel.split(',').map(model => model.trim());
+
+   return models.map(model => ({
+     name: `Custom: ${model}`,
+     provider: PROVIDERS.openai,
+     model: model,
+     description: `Custom OpenAI model: ${model}`,
+     modalities: [MODALITIES.image],
+     isReasoning: false,
+   }));
+ }
+
+ // Function to safely get Ollama models
+ export async function getOllamaModels(baseUrl?: string): Promise<ChatModel[]> {
+   // Skip if no base URL is provided or it's empty
+   if (!baseUrl || baseUrl.trim() === '') {
+     return [];
+   }
+
+   try {
+     // We can't directly configure Ollama's base URL through the API
+     // Instead, we'll use the fetch API with the provided base URL
+     const response = await fetch(`${baseUrl}/api/tags`);
+     if (!response.ok) {
+       throw new Error(`Failed to connect to Ollama server: ${response.statusText}`);
+     }
+
+     const data = await response.json();
+     return data.models.map(
+       (model: { name: string }) => ({
+         name: model.name,
+         model: model.name,
+         description: 'Ollama model',
+         provider: PROVIDERS.ollama,
+         modalities: [],
+         isReasoning: false,
+       })
+     );
+   } catch (error) {
+     console.error("Failed to fetch Ollama models:", error);
+     return [];
+   }
+ }
+
+ // Function to safely get Ollama embedding models
+ export async function getOllamaEmbeddingModels(baseUrl?: string): Promise<EmbeddingModel[]> {
+   // Skip if no base URL is provided or it's empty
+   if (!baseUrl || baseUrl.trim() === '') {
+     return [];
+   }
+
+   try {
+     // We can't directly configure Ollama's base URL through the API
+     // Instead, we'll use the fetch API with the provided base URL
+     const response = await fetch(`${baseUrl}/api/tags`);
+     if (!response.ok) {
+       throw new Error(`Failed to connect to Ollama server: ${response.statusText}`);
+     }
+
+     const data = await response.json();
+     return data.models.map(
+       (model: { name: string }) => ({
+         name: model.name,
+         model: model.name,
+         description: 'Ollama Embeddings',
+         provider: PROVIDERS.ollama,
+       })
+     );
+   } catch (error) {
+     console.error("Failed to fetch Ollama embedding models:", error);
+     return [];
+   }
+ }
+
+ // Function to safely get custom HuggingFace models
+ export async function getCustomHuggingFaceModels(hfCustomModels?: string): Promise<ChatModel[]> {
+   if (!hfCustomModels || hfCustomModels.trim() === '') {
+     return [];
+   }
+
+   // Split the comma-separated list of models into an array
+   const models = hfCustomModels.split(',').map(model => model.trim());
+
+   return models.map(model => ({
+     name: `HF Custom: ${model.split('/').pop() || model}`,
+     provider: PROVIDERS.huggingface,
+     model: model,
+     description: `Custom Hugging Face model: ${model}`,
+     modalities: [MODALITIES.image],
+     isReasoning: false,
+   }));
+ }
+
+ // Initialize with base models
+ export let CHAT_MODELS: ChatModel[] = [...BASE_CHAT_MODELS];
+ export let EMBEDDING_MODELS: EmbeddingModel[] = [...BASE_EMBEDDING_MODELS];
+
+ // Function to load all models including Ollama if available
+ export async function loadAllModels(ollamaBaseUrl?: string, openaiModel?: string, hfCustomModels?: string) {
+   try {
+     // Get the Ollama base URL from the config if not provided
+     const configManager = (await import('./manager')).ConfigManager.getInstance();
+     const config = await configManager.getConfig();
+     if (ollamaBaseUrl === undefined) {
+       try {
+         ollamaBaseUrl = config.ollama_base_url;
+       } catch (error) {
+         console.error("Error getting config for Ollama base URL:", error);
+         ollamaBaseUrl = '';
+       }
+     }
+
+     // Skip loading Ollama models if base URL is not configured
+     let ollamaModels: ChatModel[] = [];
+     let ollamaEmbeddingModels: EmbeddingModel[] = [];
+
+     if (ollamaBaseUrl && ollamaBaseUrl.trim() !== '') {
+       ollamaModels = await getOllamaModels(ollamaBaseUrl);
+       ollamaEmbeddingModels = await getOllamaEmbeddingModels(ollamaBaseUrl);
+     }
+
+     const customOpenAIModels = await getCustomOpenAIModels(openaiModel || config.openai_model);
+     const customHuggingFaceModels = await getCustomHuggingFaceModels(hfCustomModels || config.hf_custom_models);
+
+     CHAT_MODELS = [
+       ...BASE_CHAT_MODELS,
+       ...ollamaModels,
+       ...customOpenAIModels,
+       ...customHuggingFaceModels
+     ];
+     EMBEDDING_MODELS = [...BASE_EMBEDDING_MODELS, ...ollamaEmbeddingModels];
+
+     return {
+       ollamaAvailable: ollamaModels.length > 0,
+       chatModels: CHAT_MODELS,
+       embeddingModels: EMBEDDING_MODELS
+     };
+   } catch (error) {
+     console.error("Error loading models:", error);
+     return {
+       ollamaAvailable: false,
+       chatModels: BASE_CHAT_MODELS,
+       embeddingModels: BASE_EMBEDDING_MODELS
+     };
+   }
+ }
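`loadAllModels` both mutates the exported `CHAT_MODELS`/`EMBEDDING_MODELS` arrays and reports what it found; a sketch with placeholder arguments (all three are optional and fall back to the stored config when omitted):

```
import { loadAllModels, CHAT_MODELS, EMBEDDING_MODELS } from "@/lib/config/types";

async function refreshModelCatalog() {
  const { ollamaAvailable, chatModels } = await loadAllModels(
    "http://localhost:11434",              // placeholder Ollama URL
    "my-proxy-model-a,my-proxy-model-b",   // placeholder custom OpenAI model list (comma-separated)
    "HuggingFaceH4/zephyr-7b-beta"         // placeholder custom HF model list (comma-separated)
  );

  console.log("Ollama available:", ollamaAvailable);
  console.log("Chat models:", chatModels.length, "Embedding models:", EMBEDDING_MODELS.length);
  console.log("Exported CHAT_MODELS now has", CHAT_MODELS.length, "entries");
}
```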
src/lib/document/manager.ts
CHANGED
@@ -20,6 +20,8 @@ export class DocumentManager extends Dexie {
    private static instance: DocumentManager;
    fs!: typeof import("fs");
    documents!: Table<IDocument>;
+   private initialized: boolean = false;
+   private initPromise: Promise<void> | null = null;

    public static getInstance(): DocumentManager {
      if (!DocumentManager.instance) {

@@ -33,9 +35,12 @@ export class DocumentManager extends Dexie {

      this.version(1).stores({
        documents: "id, name, path, type, createdAt",
      });
-     this.initialize();
+     this.initPromise = this.initialize();
    }
+
    private async initialize() {
+     if (this.initialized) return;
+
      await new Promise<void>((resolve) => {
        configure({
          fs: "IndexedDB",

@@ -45,6 +50,14 @@ export class DocumentManager extends Dexie {

          resolve();
        });
      });
+
+     this.initialized = true;
+   }
+
+   private async ensureInitialized() {
+     if (!this.initialized) {
+       await this.initPromise;
+     }
    }

    getDocumentType(mimeType: string): DocumentType {

@@ -77,6 +90,8 @@ export class DocumentManager extends Dexie {

    }

    async uploadDocument(file: File) {
+     await this.ensureInitialized();
+
      const fileId = crypto.randomUUID();
      const fileName = `${fileId}.${file.type.split("/")[1]}`;
      const filePath = `documents/${fileName}`;

@@ -115,6 +130,8 @@ export class DocumentManager extends Dexie {

    }

    async uploadUrl(url: string) {
+     await this.ensureInitialized();
+
      const fileId = crypto.randomUUID();
      const createdAt = Date.now();
      let newDoc: IDocument;

@@ -140,6 +157,8 @@ export class DocumentManager extends Dexie {

    }

    async getDocument(id: string) {
+     await this.ensureInitialized();
+
      const file = await this.documents.get(id);
      if (!file) {
        throw new Error("Document not found");

@@ -156,12 +175,13 @@ export class DocumentManager extends Dexie {

    }

    async loadDocument(id: string) {
+     await this.ensureInitialized();
+
      const file = await this.documents.get(id);
      if (!file) {
        throw new Error("Document not found");
      }
      const type = file.type;
-     console.log(type)
      let loader: DocumentLoader;
      switch (type) {
        case "pdf":
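The `ensureInitialized` guard means callers never have to await the filesystem setup themselves; a sketch, assuming `uploadDocument` resolves with the stored `IDocument` record (its return value is not shown in this hunk):

```
import { DocumentManager } from "@/lib/document/manager";

async function importFile(file: File) {
  const documents = DocumentManager.getInstance();

  // uploadDocument/loadDocument both await ensureInitialized() internally,
  // so the IndexedDB-backed filesystem cannot race the first call.
  const doc = await documents.uploadDocument(file); // assumed to return the stored IDocument
  return documents.loadDocument(doc.id);
}
```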
src/pages/chat/components/AIMessage.tsx
CHANGED
@@ -21,7 +21,7 @@ export const AIMessageComponent = React.memo(({ message }: MessageProps) => {
    }, [message.content]);

    return (
-     <div className="flex flex-col gap-
+     <div className="flex flex-col gap-1 group">
        <div className="prose prose-sm dark:prose-invert max-w-none">
          <ReactMarkdown
            remarkPlugins={[remarkMath, remarkGfm]}
src/pages/chat/components/ModelSelector.tsx
CHANGED
@@ -23,9 +23,9 @@ export const ModelSelector = React.memo(({
    <DropdownMenuTrigger asChild>
      <Button
        variant="ghost"
-       className="w-[200px] h-8 p-1 justify-start font-normal"
+       className="w-[200px] h-8 p-1 justify-start font-normal truncate"
      >
-       {selectedModelName}
+       <span className="truncate">{selectedModelName}</span>
      </Button>
    </DropdownMenuTrigger>
    <DropdownMenuContent className="w-[400px]">
src/pages/chat/hooks.ts
CHANGED
@@ -1,74 +1,4 @@
- import
- import { useLiveQuery } from "dexie-react-hooks";
- import { ChatHistoryDB } from "@/lib/chat/memory";
- import { Config, ChatSession } from "./types";
- import { HumanMessage, AIMessageChunk } from "@langchain/core/messages";
- import { ChatManager } from "@/lib/chat/manager";
- import { IDocument } from "@/lib/document/types";
- import { toast } from "sonner";
+ import { useChatSession, useSelectedModel, generateMessage, useChatManager } from "@/hooks/use-chat";

- export
-   return useLiveQuery(async () => {
-     if (!id || id === "new") return null;
-     return await chatHistoryDB.sessions.get(id);
-   }, [id]);
- };
-
- export const useSelectedModel = (id: string | undefined, config: Config | undefined) => {
-   const [selectedModel, setSelectedModel] = React.useState<string | null>(null);
-   const chatHistoryDB = React.useMemo(() => new ChatHistoryDB(), []);
-
-   useLiveQuery(async () => {
-     if (!config) return;
-
-     const model = !id || id === "new"
-       ? config.default_chat_model
-       : (await chatHistoryDB.sessions.get(id) as ChatSession | undefined)?.model ?? config.default_chat_model;
-     setSelectedModel(model);
-   }, [id, config]);
-
-   return [selectedModel, setSelectedModel, chatHistoryDB] as const;
- };
-
- export const generateMessage = async (
-   chatId: string | undefined,
-   input: string,
-   attachments: IDocument[],
-   isGenerating: boolean,
-   setIsGenerating: (isGenerating: boolean) => void,
-   setStreamingHumanMessage: (streamingHumanMessage: HumanMessage | null) => void,
-   setStreamingAIMessageChunks: React.Dispatch<React.SetStateAction<AIMessageChunk[]>>,
-   chatManager: ChatManager,
-   setInput: (input: string) => void,
-   setAttachments: (attachments: IDocument[]) => void
- ) => {
-   if (!chatId || !input.trim() || isGenerating) return;
-
-   try {
-     setIsGenerating(true);
-
-     const chatInput = input;
-     const chatAttachments = attachments;
-
-     setInput("");
-     setAttachments([]);
-     setStreamingHumanMessage(new HumanMessage(chatInput));
-
-     const messageIterator = chatManager.chat(chatId, chatInput, chatAttachments);
-
-     for await (const event of messageIterator) {
-       if (event.type === "stream") {
-         setStreamingAIMessageChunks(prev => [...prev, event.content as AIMessageChunk]);
-       } else if (event.type === "end") {
-         setIsGenerating(false);
-         setStreamingHumanMessage(null);
-         setStreamingAIMessageChunks([]);
-       }
-     }
-   } catch (error) {
-     console.error(error);
-     toast.error(`Failed to send message: ${error}`);
-     setIsGenerating(false);
-   }
- };
+
+ // Re-export the hooks
+ export { useChatSession, useSelectedModel, generateMessage, useChatManager };
src/pages/chat/page.tsx
CHANGED
@@ -1,27 +1,30 @@
|
|
1 |
-
import React from "react";
|
2 |
import { useParams, useNavigate } from "react-router-dom";
|
3 |
-
import { ChatManager } from "@/lib/chat/manager";
|
4 |
import { useLiveQuery } from "dexie-react-hooks";
|
5 |
import { mapStoredMessageToChatMessage } from "@langchain/core/messages";
|
6 |
-
import { DexieChatMemory } from "@/lib/chat/memory";
|
7 |
import { ConfigManager } from "@/lib/config/manager";
|
8 |
-
import { DocumentManager } from "@/lib/document/manager";
|
9 |
import { toast } from "sonner";
|
10 |
import { Messages } from "./components/Messages";
|
11 |
import { Input } from "./components/Input";
|
12 |
import { FilePreviewDialog } from "./components/FilePreviewDialog";
|
13 |
-
import { useChatSession, useSelectedModel, generateMessage } from "
|
14 |
-
import { CHAT_MODELS } from "@/lib/config/types";
|
15 |
import { IDocument } from "@/lib/document/types";
|
16 |
import { HumanMessage } from "@langchain/core/messages";
|
17 |
import { AIMessageChunk } from "@langchain/core/messages";
|
|
|
|
|
|
|
|
|
18 |
|
19 |
export function ChatPage() {
|
20 |
const { id } = useParams();
|
21 |
const navigate = useNavigate();
|
|
|
22 |
|
|
|
23 |
const configManager = React.useMemo(() => ConfigManager.getInstance(), []);
|
24 |
-
const chatManager =
|
25 |
const documentManager = React.useMemo(() => DocumentManager.getInstance(), []);
|
26 |
|
27 |
const [input, setInput] = React.useState("");
|
@@ -33,15 +36,30 @@ export function ChatPage() {
|
|
33 |
const [streamingHumanMessage, setStreamingHumanMessage] = React.useState<HumanMessage | null>(null);
|
34 |
const [streamingAIMessageChunks, setStreamingAIMessageChunks] = React.useState<AIMessageChunk[]>([]);
|
35 |
const [editingMessageIndex, setEditingMessageIndex] = React.useState<number | null>(null);
|
|
|
36 |
|
37 |
const config = useLiveQuery(async () => await configManager.getConfig());
|
38 |
const chatSession = useChatSession(id);
|
39 |
const [selectedModel, setSelectedModel, chatHistoryDB] = useSelectedModel(id, config);
|
40 |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
41 |
const selectedModelName = React.useMemo(() => (
|
42 |
CHAT_MODELS.find(model => model.model === selectedModel)?.name || "Select a model"
|
43 |
), [selectedModel]);
|
44 |
|
|
|
|
|
|
|
|
|
|
|
45 |
const handleModelChange = React.useCallback(async (model: string) => {
|
46 |
if (!config) return;
|
47 |
|
@@ -61,29 +79,69 @@ export function ChatPage() {
|
|
61 |
}
|
62 |
}
|
63 |
setSelectedModel(model);
|
|
|
64 |
}, [config, id, setSelectedModel, configManager, chatHistoryDB.sessions]);
|
65 |
|
66 |
const handleSendMessage = React.useCallback(async () => {
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
67 |
let chatId = id;
|
68 |
let isNewChat = false;
|
69 |
if (id === "new") {
|
70 |
chatId = crypto.randomUUID();
|
71 |
-
new DexieChatMemory(chatId);
|
72 |
isNewChat = true;
|
73 |
navigate(`/chat/${chatId}`, { replace: true });
|
74 |
}
|
75 |
-
|
76 |
-
|
77 |
-
|
78 |
-
|
79 |
-
|
80 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
81 |
);
|
82 |
-
|
83 |
-
|
84 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
85 |
}
|
86 |
-
}, [id, input, attachments, isGenerating, chatManager, navigate, chatHistoryDB.sessions]);
|
87 |
|
88 |
const handleAttachmentFileUpload = React.useCallback(async (event: React.ChangeEvent<HTMLInputElement>) => {
|
89 |
const files = event.target.files;
|
@@ -128,59 +186,144 @@ export function ChatPage() {
|
|
128 |
const handleSaveEdit = React.useCallback(async (content: string) => {
|
129 |
if (!id || editingMessageIndex === null || !chatSession || isGenerating) return;
|
130 |
|
131 |
-
//
|
132 |
-
|
133 |
-
updatedMessages[editingMessageIndex] = {
|
134 |
-
...updatedMessages[editingMessageIndex],
|
135 |
-
data: {
|
136 |
-
...updatedMessages[editingMessageIndex].data,
|
137 |
-
content
|
138 |
-
}
|
139 |
-
};
|
140 |
-
// Remove messages after the edited message
|
141 |
-
const newMessages = updatedMessages.slice(0, editingMessageIndex);
|
142 |
-
|
143 |
-
await chatHistoryDB.sessions.update(id, {
|
144 |
-
...chatSession,
|
145 |
-
messages: newMessages,
|
146 |
-
updatedAt: Date.now()
|
147 |
-
});
|
148 |
|
149 |
-
|
150 |
-
|
151 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
152 |
|
153 |
-
|
154 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
155 |
|
156 |
const handleRegenerateMessage = React.useCallback(async (index: number) => {
|
157 |
if (!id || !chatSession || isGenerating) return;
|
158 |
|
159 |
-
|
160 |
-
|
161 |
|
162 |
-
|
163 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
164 |
|
165 |
-
|
166 |
-
|
167 |
-
|
168 |
-
|
169 |
-
|
170 |
-
|
171 |
-
|
172 |
-
|
173 |
-
|
174 |
-
|
175 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
176 |
|
177 |
const stopGenerating = React.useCallback(() => {
|
178 |
chatManager.controller.abort();
|
179 |
setIsGenerating(false);
|
180 |
-
}, [chatManager
|
181 |
|
182 |
return (
|
183 |
<div className="flex flex-col h-screen p-2">
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
184 |
<Messages
|
185 |
messages={chatSession?.messages.map(mapStoredMessageToChatMessage)}
|
186 |
streamingHumanMessage={streamingHumanMessage}
|
|
|
1 |
+
import React, { useEffect } from "react";
|
2 |
import { useParams, useNavigate } from "react-router-dom";
|
|
|
3 |
import { useLiveQuery } from "dexie-react-hooks";
|
4 |
import { mapStoredMessageToChatMessage } from "@langchain/core/messages";
|
|
|
5 |
import { ConfigManager } from "@/lib/config/manager";
|
|
|
6 |
import { toast } from "sonner";
|
7 |
import { Messages } from "./components/Messages";
|
8 |
import { Input } from "./components/Input";
|
9 |
import { FilePreviewDialog } from "./components/FilePreviewDialog";
|
10 |
+
import { useChatSession, useSelectedModel, generateMessage, useChatManager } from "@/hooks/use-chat";
|
11 |
+
import { CHAT_MODELS, PROVIDERS } from "@/lib/config/types";
|
12 |
import { IDocument } from "@/lib/document/types";
|
13 |
import { HumanMessage } from "@langchain/core/messages";
|
14 |
import { AIMessageChunk } from "@langchain/core/messages";
|
15 |
+
import { DocumentManager } from "@/lib/document/manager";
|
16 |
+
import { useLoading } from "@/contexts/loading-context";
|
17 |
+
import { Alert, AlertTitle, AlertDescription } from "@/components/ui/alert";
|
18 |
+
import { AlertCircle } from "lucide-react";
|
19 |
|
20 |
export function ChatPage() {
|
21 |
const { id } = useParams();
|
22 |
const navigate = useNavigate();
|
23 |
+
const { startLoading, stopLoading } = useLoading();
|
24 |
|
25 |
+
// Use singleton instances
|
26 |
const configManager = React.useMemo(() => ConfigManager.getInstance(), []);
|
27 |
+
const chatManager = useChatManager();
|
28 |
const documentManager = React.useMemo(() => DocumentManager.getInstance(), []);
|
29 |
|
30 |
const [input, setInput] = React.useState("");
|
|
|
36 |
const [streamingHumanMessage, setStreamingHumanMessage] = React.useState<HumanMessage | null>(null);
|
37 |
const [streamingAIMessageChunks, setStreamingAIMessageChunks] = React.useState<AIMessageChunk[]>([]);
|
38 |
const [editingMessageIndex, setEditingMessageIndex] = React.useState<number | null>(null);
|
39 |
+
const [error, setError] = React.useState<string | null>(null);
|
40 |
|
41 |
const config = useLiveQuery(async () => await configManager.getConfig());
|
42 |
const chatSession = useChatSession(id);
|
43 |
const [selectedModel, setSelectedModel, chatHistoryDB] = useSelectedModel(id, config);
|
44 |
|
45 |
+
// Show loading screen during initial config load
|
46 |
+
useEffect(() => {
|
47 |
+
if (!config) {
|
48 |
+
startLoading("Loading configuration...");
|
49 |
+
return;
|
50 |
+
}
|
51 |
+
stopLoading();
|
52 |
+
}, [config, startLoading, stopLoading]);
|
53 |
+
|
54 |
const selectedModelName = React.useMemo(() => (
|
55 |
CHAT_MODELS.find(model => model.model === selectedModel)?.name || "Select a model"
|
56 |
), [selectedModel]);
|
57 |
|
58 |
+
const selectedModelProvider = React.useMemo(() => {
|
59 |
+
const model = CHAT_MODELS.find(model => model.model === selectedModel);
|
60 |
+
return model?.provider;
|
61 |
+
}, [selectedModel]);
|
62 |
+
|
63 |
const handleModelChange = React.useCallback(async (model: string) => {
|
64 |
if (!config) return;
|
65 |
|
|
|
79 |
}
|
80 |
}
|
81 |
setSelectedModel(model);
|
82 |
+
setError(null); // Clear any previous errors when changing models
|
83 |
}, [config, id, setSelectedModel, configManager, chatHistoryDB.sessions]);
|
84 |
|
85 |
const handleSendMessage = React.useCallback(async () => {
|
86 |
+
// Clear any previous errors
|
87 |
+
setError(null);
|
88 |
+
|
89 |
+
// Check if trying to use Ollama when it's not available or not configured
|
90 |
+
if (selectedModelProvider === PROVIDERS.ollama && config) {
|
91 |
+
if (!config.ollama_base_url || config.ollama_base_url.trim() === '') {
|
92 |
+
setError(`Ollama base URL is not configured. Please set a valid URL in the settings.`);
|
93 |
+
return;
|
94 |
+
}
|
95 |
+
|
96 |
+
if (!config.ollama_available) {
|
97 |
+
setError(`Ollama server is not available. Please check your connection to ${config.ollama_base_url}`);
|
98 |
+
return;
|
99 |
+
}
|
100 |
+
}
|
101 |
+
|
102 |
let chatId = id;
|
103 |
let isNewChat = false;
|
104 |
if (id === "new") {
|
105 |
chatId = crypto.randomUUID();
|
|
|
106 |
        isNewChat = true;
        navigate(`/chat/${chatId}`, { replace: true });
      }
+
+     // Reset controller before starting a new chat
+     chatManager.resetController();
+
+     try {
+       await generateMessage(
+         chatId,
+         input,
+         attachments,
+         isGenerating,
+         setIsGenerating,
+         setStreamingHumanMessage,
+         setStreamingAIMessageChunks,
+         chatManager,
+         setInput,
+         setAttachments
        );
+
+       if (isNewChat && chatId) {
+         const chatName = await chatManager.chatChain(
+           `Based on this user message, generate a very concise (max 40 chars) but descriptive name for this chat: "${input}"`,
+           "You are a helpful assistant that generates concise chat names. Respond only with the name, no quotes or explanation."
+         );
+         await chatHistoryDB.sessions.update(chatId, {
+           name: String(chatName.content)
+         });
+       }
+     } catch (error) {
+       console.error("Error sending message:", error);
+       if (error instanceof Error) {
+         setError(error.message);
+       } else {
+         setError("An unknown error occurred while sending your message");
+       }
      }
+   }, [id, input, attachments, isGenerating, chatManager, navigate, chatHistoryDB.sessions, selectedModelProvider, config]);

    const handleAttachmentFileUpload = React.useCallback(async (event: React.ChangeEvent<HTMLInputElement>) => {
      const files = event.target.files;
...
    const handleSaveEdit = React.useCallback(async (content: string) => {
      if (!id || editingMessageIndex === null || !chatSession || isGenerating) return;

+     // Clear any previous errors
+     setError(null);

+     // Check if trying to use Ollama when it's not available or not configured
+     if (selectedModelProvider === PROVIDERS.ollama && config) {
+       if (!config.ollama_base_url || config.ollama_base_url.trim() === '') {
+         setError(`Ollama base URL is not configured. Please set a valid URL in the settings.`);
+         return;
+       }
+
+       if (!config.ollama_available) {
+         setError(`Ollama server is not available. Please check your connection to ${config.ollama_base_url}`);
+         return;
+       }
+     }

+     try {
+       // Update the message directly in the database
+       const updatedMessages = [...chatSession.messages];
+       updatedMessages[editingMessageIndex] = {
+         ...updatedMessages[editingMessageIndex],
+         data: {
+           ...updatedMessages[editingMessageIndex].data,
+           content
+         }
+       };
+       // Remove messages after the edited message
+       const newMessages = updatedMessages.slice(0, editingMessageIndex + 1);
+
+       await chatHistoryDB.sessions.update(id, {
+         ...chatSession,
+         messages: newMessages,
+         updatedAt: Date.now()
+       });
+
+       setInput(content);
+       setEditingMessageIndex(null);
+       setAttachments([]);
+
+       // Reset controller before regenerating
+       chatManager.resetController();
+
+       await generateMessage(
+         id,
+         content,
+         [],
+         isGenerating,
+         setIsGenerating,
+         setStreamingHumanMessage,
+         setStreamingAIMessageChunks,
+         chatManager,
+         setInput,
+         setAttachments
+       );
+     } catch (error) {
+       console.error("Error editing message:", error);
+       if (error instanceof Error) {
+         setError(error.message);
+       } else {
+         setError("An unknown error occurred while editing your message");
+       }
+     }
+   }, [id, editingMessageIndex, chatSession, isGenerating, chatHistoryDB.sessions, chatManager, selectedModelProvider, config]);

    const handleRegenerateMessage = React.useCallback(async (index: number) => {
      if (!id || !chatSession || isGenerating) return;

+     // Clear any previous errors
+     setError(null);

+     // Check if trying to use Ollama when it's not available or not configured
+     if (selectedModelProvider === PROVIDERS.ollama && config) {
+       if (!config.ollama_base_url || config.ollama_base_url.trim() === '') {
+         setError(`Ollama base URL is not configured. Please set a valid URL in the settings.`);
+         return;
+       }
+
+       if (!config.ollama_available) {
+         setError(`Ollama server is not available. Please check your connection to ${config.ollama_base_url}`);
+         return;
+       }
+     }

+     try {
+       const messages = chatSession.messages;
+       if (messages.length <= index) return;
+
+       const message = messages[index];
+       const content = message.data.content;
+
+       // Remove messages after the current message
+       const newMessages = messages.slice(0, index + 1);
+
+       await chatHistoryDB.sessions.update(id, {
+         ...chatSession,
+         messages: newMessages,
+         updatedAt: Date.now()
+       });
+
+       // Reset controller before regenerating
+       chatManager.resetController();
+
+       await generateMessage(
+         id,
+         content,
+         [],
+         isGenerating,
+         setIsGenerating,
+         setStreamingHumanMessage,
+         setStreamingAIMessageChunks,
+         chatManager,
+         setInput,
+         setAttachments
+       );
+     } catch (error) {
+       console.error("Error regenerating message:", error);
+       if (error instanceof Error) {
+         setError(error.message);
+       } else {
+         setError("An unknown error occurred while regenerating the message");
+       }
+     }
+   }, [id, chatSession, isGenerating, chatHistoryDB.sessions, chatManager, selectedModelProvider, config]);

    const stopGenerating = React.useCallback(() => {
      chatManager.controller.abort();
      setIsGenerating(false);
+   }, [chatManager]);

    return (
      <div className="flex flex-col h-screen p-2">
+       {error && (
+         <Alert variant="destructive" className="mb-4">
+           <AlertCircle className="h-4 w-4" />
+           <AlertTitle>Error</AlertTitle>
+           <AlertDescription>{error}</AlertDescription>
+         </Alert>
+       )}
        <Messages
          messages={chatSession?.messages.map(mapStoredMessageToChatMessage)}
          streamingHumanMessage={streamingHumanMessage}
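The handlers above cancel and restart streaming through `chatManager.controller` and `chatManager.resetController()`. The `ChatManager` changes themselves live in `src/lib/chat/manager.ts` and are not shown in this excerpt; the following is only a minimal sketch of the AbortController pattern they imply, with every name treated as an assumption.

```ts
// Sketch only — assumes ChatManager exposes roughly this surface.
class ChatManagerAbortSketch {
  controller = new AbortController();

  // Called before each new generation so a previous abort() cannot cancel it.
  resetController() {
    this.controller = new AbortController();
  }
}

// Usage mirrors the page code: stopGenerating() calls controller.abort(),
// while handleSendMessage()/handleSaveEdit() call resetController() first.
```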
src/pages/home/components/ChatModels.tsx
ADDED
@@ -0,0 +1,19 @@
+import { CHAT_MODELS } from "@/lib/config/types";
+import { ModelList } from "./ModelList";
+import { ChatModelsProps } from "../types";
+
+export const ChatModels = ({ config }: ChatModelsProps) => {
+  if (!config) return null;
+
+  return (
+    <ModelList
+      type="chat"
+      models={CHAT_MODELS}
+      config={config}
+      enabledModels={config.enabled_chat_models || []}
+      defaultModel={config.default_chat_model}
+      title="Chat Models"
+      description="Enable or disable available chat models"
+    />
+  );
+}
src/pages/home/components/EmbeddingModels.tsx
ADDED
@@ -0,0 +1,19 @@
+import { EMBEDDING_MODELS } from "@/lib/config/types";
+import { ModelList } from "./ModelList";
+import { EmbeddingModelsProps } from "../types";
+
+export const EmbeddingModels = ({ config }: EmbeddingModelsProps) => {
+  if (!config) return null;
+
+  return (
+    <ModelList
+      type="embedding"
+      models={EMBEDDING_MODELS}
+      config={config}
+      enabledModels={config.enabled_embedding_models || []}
+      defaultModel={config.default_embedding_model}
+      title="Embedding Models"
+      description="Enable or disable available embedding models"
+    />
+  );
+}
src/pages/home/components/ModelList.tsx
ADDED
@@ -0,0 +1,165 @@
1 |
+
import { useState } from "react";
|
2 |
+
import { PROVIDERS, PROVIDERS_CONN_ARGS_MAP, ChatModel, BaseModel } from "@/lib/config/types";
|
3 |
+
import { Card, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
|
4 |
+
import { Badge } from "@/components/ui/badge";
|
5 |
+
import { Switch } from "@/components/ui/switch";
|
6 |
+
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
|
7 |
+
import { Button } from "@/components/ui/button";
|
8 |
+
import { ScrollArea } from "@/components/ui/scroll-area";
|
9 |
+
import { useUpdateConfig, useUpdateEnabledModels } from "../hooks";
|
10 |
+
import { ModelListProps } from "../types";
|
11 |
+
|
12 |
+
export const ModelList = ({ type, models, config, enabledModels, defaultModel, title, description }: ModelListProps) => {
|
13 |
+
const updateEnabledModels = useUpdateEnabledModels(type);
|
14 |
+
const updateConfig = useUpdateConfig();
|
15 |
+
const [selectedDefault, setSelectedDefault] = useState(defaultModel);
|
16 |
+
|
17 |
+
const handleModelToggle = async (model: BaseModel, enabled: boolean) => {
|
18 |
+
await updateEnabledModels.mutateAsync({
|
19 |
+
modelId: model.model,
|
20 |
+
enabled,
|
21 |
+
});
|
22 |
+
};
|
23 |
+
|
24 |
+
const handleDefaultModelChange = async (modelId: string) => {
|
25 |
+
setSelectedDefault(modelId);
|
26 |
+
await updateConfig.mutateAsync({
|
27 |
+
[type === 'chat' ? 'default_chat_model' : 'default_embedding_model']: modelId,
|
28 |
+
});
|
29 |
+
};
|
30 |
+
|
31 |
+
if (!config) return null;
|
32 |
+
|
33 |
+
const isProviderConfigured = (provider: PROVIDERS) => {
|
34 |
+
// Special case for OpenAI - only require API key
|
35 |
+
if (provider === PROVIDERS.openai) {
|
36 |
+
return config.openai_api_key && config.openai_api_key.trim().length > 0;
|
37 |
+
}
|
38 |
+
|
39 |
+
// For other providers, check all required fields
|
40 |
+
return PROVIDERS_CONN_ARGS_MAP[provider].every(
|
41 |
+
(arg) => {
|
42 |
+
const value = config[arg as keyof typeof config];
|
43 |
+
return typeof value === 'string' && value.trim().length > 0;
|
44 |
+
}
|
45 |
+
);
|
46 |
+
};
|
47 |
+
|
48 |
+
const handleSelectAll = async () => {
|
49 |
+
for (const model of models) {
|
50 |
+
if (isProviderConfigured(model.provider) && !enabledModels.includes(model.model)) {
|
51 |
+
await updateEnabledModels.mutateAsync({ modelId: model.model, enabled: true });
|
52 |
+
}
|
53 |
+
}
|
54 |
+
};
|
55 |
+
|
56 |
+
const handleUnselectAll = async () => {
|
57 |
+
for (const model of models) {
|
58 |
+
if (enabledModels.includes(model.model)) {
|
59 |
+
await updateEnabledModels.mutateAsync({ modelId: model.model, enabled: false });
|
60 |
+
}
|
61 |
+
}
|
62 |
+
// Then clear the default model
|
63 |
+
updateConfig.mutate({ [`default_${type}_model`]: '' });
|
64 |
+
};
|
65 |
+
|
66 |
+
return (
|
67 |
+
<Card className="h-[calc(100vh-12rem)]">
|
68 |
+
<CardHeader className="space-y-6">
|
69 |
+
<div className="flex items-center justify-between">
|
70 |
+
<div>
|
71 |
+
<CardTitle>{title}</CardTitle>
|
72 |
+
<CardDescription>{description}</CardDescription>
|
73 |
+
</div>
|
74 |
+
<div className="flex gap-2">
|
75 |
+
<Button
|
76 |
+
variant="outline"
|
77 |
+
size="sm"
|
78 |
+
onClick={handleSelectAll}
|
79 |
+
disabled={models.length === 0 || updateConfig.isPending}
|
80 |
+
>
|
81 |
+
Select All
|
82 |
+
</Button>
|
83 |
+
<Button
|
84 |
+
variant="outline"
|
85 |
+
size="sm"
|
86 |
+
onClick={handleUnselectAll}
|
87 |
+
disabled={enabledModels.length === 0 || updateConfig.isPending}
|
88 |
+
>
|
89 |
+
Unselect All
|
90 |
+
</Button>
|
91 |
+
</div>
|
92 |
+
</div>
|
93 |
+
<div>
|
94 |
+
<Select value={selectedDefault} onValueChange={handleDefaultModelChange}>
|
95 |
+
<SelectTrigger className="w-[200px]">
|
96 |
+
<SelectValue placeholder="Select default model" />
|
97 |
+
</SelectTrigger>
|
98 |
+
<SelectContent>
|
99 |
+
{models.filter(model => enabledModels.includes(model.model)).map(model => (
|
100 |
+
<SelectItem key={model.model} value={model.model}>
|
101 |
+
{model.name}
|
102 |
+
</SelectItem>
|
103 |
+
))}
|
104 |
+
</SelectContent>
|
105 |
+
</Select>
|
106 |
+
{enabledModels.length === 0 && (
|
107 |
+
<p className="text-sm text-muted-foreground mt-2">
|
108 |
+
Enable at least one model to set as default
|
109 |
+
</p>
|
110 |
+
)}
|
111 |
+
</div>
|
112 |
+
</CardHeader>
|
113 |
+
<ScrollArea className="h-[calc(100%-13rem)] px-6">
|
114 |
+
<div className="space-y-4 pb-6">
|
115 |
+
{models.map((model) => {
|
116 |
+
const providerConfigured = isProviderConfigured(model.provider);
|
117 |
+
const isChatModel = (m: BaseModel): m is ChatModel => 'modalities' in m;
|
118 |
+
|
119 |
+
return (
|
120 |
+
<div key={model.model} className={`p-4 rounded-lg border ${providerConfigured ? '' : 'opacity-50'}`}>
|
121 |
+
<div className="flex items-start justify-between">
|
122 |
+
<div className="space-y-1">
|
123 |
+
<h3 className="font-medium">{model.name}</h3>
|
124 |
+
<p className="text-sm text-muted-foreground">{model.description}</p>
|
125 |
+
{!providerConfigured && (
|
126 |
+
<p className="text-sm text-muted-foreground">
|
127 |
+
Configure {model.provider} provider first
|
128 |
+
</p>
|
129 |
+
)}
|
130 |
+
{isChatModel(model) && model.modalities && model.modalities.length > 0 && (
|
131 |
+
<div className="flex gap-2 flex-wrap pt-2">
|
132 |
+
{model.modalities.map((modality) => (
|
133 |
+
<Badge key={modality} variant="outline" className="capitalize">
|
134 |
+
{modality.toLowerCase()}
|
135 |
+
</Badge>
|
136 |
+
))}
|
137 |
+
</div>
|
138 |
+
)}
|
139 |
+
{isChatModel(model) && model.isReasoning && (
|
140 |
+
<div className="flex gap-2 flex-wrap pt-2">
|
141 |
+
<Badge variant="outline" className="bg-purple-50 text-purple-700 border-purple-200 dark:bg-purple-950 dark:text-purple-300 dark:border-purple-800">
|
142 |
+
Thinking
|
143 |
+
</Badge>
|
144 |
+
</div>
|
145 |
+
)}
|
146 |
+
</div>
|
147 |
+
<Switch
|
148 |
+
checked={enabledModels.includes(model.model)}
|
149 |
+
onCheckedChange={(checked) => {
|
150 |
+
handleModelToggle(model, checked);
|
151 |
+
if (!checked && selectedDefault === model.model) {
|
152 |
+
handleDefaultModelChange('');
|
153 |
+
}
|
154 |
+
}}
|
155 |
+
disabled={!providerConfigured || updateEnabledModels.isPending}
|
156 |
+
/>
|
157 |
+
</div>
|
158 |
+
</div>
|
159 |
+
);
|
160 |
+
})}
|
161 |
+
</div>
|
162 |
+
</ScrollArea>
|
163 |
+
</Card>
|
164 |
+
);
|
165 |
+
}
|
src/pages/home/components/Others.tsx
ADDED
@@ -0,0 +1,25 @@
+import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
+import { OthersProps } from "../types";
+
+export const Others = ({ config }: OthersProps) => {
+  if (!config) return null;
+
+  return (
+    <Card className="h-[calc(100vh-12rem)]">
+      <CardHeader>
+        <CardTitle>Other Settings</CardTitle>
+        <CardDescription>Additional configuration options</CardDescription>
+      </CardHeader>
+      <CardContent className="space-y-8 px-6">
+        <div className="space-y-4">
+          <div className="flex items-center justify-between">
+            <div>
+              <h3 className="text-lg font-semibold">Integrations</h3>
+              <p className="text-sm text-muted-foreground">Connect to external services like Hugging Face to use their inference endpoints</p>
+            </div>
+          </div>
+        </div>
+      </CardContent>
+    </Card>
+  );
+}
src/pages/home/components/Providers.tsx
ADDED
@@ -0,0 +1,262 @@
1 |
+
import { useState } from "react";
|
2 |
+
import { PROVIDERS, PROVIDERS_CONN_ARGS_MAP } from "@/lib/config/types";
|
3 |
+
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
|
4 |
+
import { Input } from "@/components/ui/input";
|
5 |
+
import { ScrollArea } from "@/components/ui/scroll-area";
|
6 |
+
import { Badge } from "@/components/ui/badge";
|
7 |
+
import { Alert, AlertTitle, AlertDescription } from "@/components/ui/alert";
|
8 |
+
import { AlertCircle, CheckCircle2, ChevronDown } from "lucide-react";
|
9 |
+
import { useUpdateConfig, useDebounceCallback } from "../hooks";
|
10 |
+
import { ProvidersProps } from "../types";
|
11 |
+
import { Collapsible, CollapsibleContent, CollapsibleTrigger } from "@/components/ui/collapsible";
|
12 |
+
import { Button } from "@/components/ui/button";
|
13 |
+
import { toast } from "sonner";
|
14 |
+
|
15 |
+
export const Providers = ({ config }: ProvidersProps) => {
|
16 |
+
const updateConfig = useUpdateConfig();
|
17 |
+
const [localValues, setLocalValues] = useState<Record<string, string>>({});
|
18 |
+
const [openCollapsible, setOpenCollapsible] = useState(false);
|
19 |
+
const debouncedUpdateConfig = useDebounceCallback((updates: Record<string, string>) => {
|
20 |
+
updateConfig.mutate(updates);
|
21 |
+
}, 500);
|
22 |
+
|
23 |
+
if (!config) return null;
|
24 |
+
|
25 |
+
const handleHuggingFaceOAuth = () => {
|
26 |
+
const clientId = import.meta.env.VITE_HF_CLIENT_ID;
|
27 |
+
const redirectUri = import.meta.env.VITE_HF_REDIRECT_URI;
|
28 |
+
|
29 |
+
if (!clientId || !redirectUri) {
|
30 |
+
toast.error("HuggingFace OAuth configuration is missing");
|
31 |
+
return;
|
32 |
+
}
|
33 |
+
|
34 |
+
const state = Math.random().toString(36).substring(2);
|
35 |
+
const authUrl = `https://huggingface.co/oauth/authorize?client_id=${clientId}&redirect_uri=${redirectUri}&response_type=code&scope=openid%20inference-api&prompt=consent&state=${state}`;
|
36 |
+
|
37 |
+
window.location.href = authUrl;
|
38 |
+
};
|
39 |
+
|
40 |
+
return (
|
41 |
+
<Card className="h-[calc(100vh-12rem)]">
|
42 |
+
<CardHeader>
|
43 |
+
<CardTitle>Providers</CardTitle>
|
44 |
+
<CardDescription>Configure your AI provider credentials</CardDescription>
|
45 |
+
</CardHeader>
|
46 |
+
<ScrollArea className="h-[calc(100%-8rem)]">
|
47 |
+
<CardContent className="space-y-8 px-6">
|
48 |
+
{Object.values(PROVIDERS).map((provider) => (
|
49 |
+
<div key={provider} className="space-y-4">
|
50 |
+
<div className="flex items-center justify-between">
|
51 |
+
<h3 className="text-lg font-semibold capitalize">{provider}</h3>
|
52 |
+
{provider === PROVIDERS.ollama && (
|
53 |
+
<div className="flex items-center gap-2">
|
54 |
+
{!config.ollama_base_url || config.ollama_base_url.trim() === '' ? (
|
55 |
+
<Badge variant="outline" className="bg-yellow-50 text-yellow-700 border-yellow-200 dark:bg-yellow-950 dark:text-yellow-300 dark:border-yellow-800">
|
56 |
+
<AlertCircle className="h-3.5 w-3.5 mr-1" />
|
57 |
+
Not Configured
|
58 |
+
</Badge>
|
59 |
+
) : config.ollama_available ? (
|
60 |
+
<Badge variant="outline" className="bg-green-50 text-green-700 border-green-200 dark:bg-green-950 dark:text-green-300 dark:border-green-800">
|
61 |
+
<CheckCircle2 className="h-3.5 w-3.5 mr-1" />
|
62 |
+
Connected
|
63 |
+
</Badge>
|
64 |
+
) : (
|
65 |
+
<Badge variant="outline" className="bg-red-50 text-red-700 border-red-200 dark:bg-red-950 dark:text-red-300 dark:border-red-800">
|
66 |
+
<AlertCircle className="h-3.5 w-3.5 mr-1" />
|
67 |
+
Not Connected
|
68 |
+
</Badge>
|
69 |
+
)}
|
70 |
+
</div>
|
71 |
+
)}
|
72 |
+
{provider === PROVIDERS.huggingface && config?.hf_token && (
|
73 |
+
<Badge>Connected</Badge>
|
74 |
+
)}
|
75 |
+
</div>
|
76 |
+
<div className="grid gap-4">
|
77 |
+
{provider === PROVIDERS.openai ? (
|
78 |
+
<>
|
79 |
+
{/* Required OpenAI fields */}
|
80 |
+
{PROVIDERS_CONN_ARGS_MAP[provider]
|
81 |
+
.filter(arg => arg === 'openai_api_key')
|
82 |
+
.map((arg) => {
|
83 |
+
const value = localValues[arg] ?? config[arg as keyof typeof config] ?? '';
|
84 |
+
|
85 |
+
return (
|
86 |
+
<div key={arg} className="space-y-2">
|
87 |
+
<label htmlFor={arg} className="text-sm font-medium flex items-center">
|
88 |
+
{arg.replace(/_/g, ' ').replace(/url/i, 'URL')}
|
89 |
+
</label>
|
90 |
+
<Input
|
91 |
+
id={arg}
|
92 |
+
type="text"
|
93 |
+
value={value}
|
94 |
+
onChange={(e) => {
|
95 |
+
const newValue = e.target.value;
|
96 |
+
setLocalValues(prev => ({ ...prev, [arg]: newValue }));
|
97 |
+
debouncedUpdateConfig({ [arg]: newValue });
|
98 |
+
}}
|
99 |
+
disabled={updateConfig.isPending}
|
100 |
+
className="font-mono"
|
101 |
+
/>
|
102 |
+
</div>
|
103 |
+
);
|
104 |
+
})}
|
105 |
+
|
106 |
+
{/* Optional OpenAI fields in Collapsible */}
|
107 |
+
<Collapsible
|
108 |
+
open={openCollapsible}
|
109 |
+
onOpenChange={setOpenCollapsible}
|
110 |
+
className="mt-2 space-y-2 border rounded-md p-2"
|
111 |
+
>
|
112 |
+
<div className="flex items-center justify-between">
|
113 |
+
<h4 className="text-sm font-medium text-muted-foreground">Advanced Options</h4>
|
114 |
+
<CollapsibleTrigger asChild>
|
115 |
+
<Button variant="ghost" size="sm" className="p-0 h-8 w-8">
|
116 |
+
<ChevronDown className={`h-4 w-4 transition-transform ${openCollapsible ? "transform rotate-180" : ""}`} />
|
117 |
+
<span className="sr-only">Toggle advanced options</span>
|
118 |
+
</Button>
|
119 |
+
</CollapsibleTrigger>
|
120 |
+
</div>
|
121 |
+
<CollapsibleContent className="space-y-4 pt-2">
|
122 |
+
{PROVIDERS_CONN_ARGS_MAP[provider]
|
123 |
+
.filter(arg => arg === 'openai_base_url' || arg === 'openai_model')
|
124 |
+
.map((arg) => {
|
125 |
+
const value = localValues[arg] ?? config[arg as keyof typeof config] ?? '';
|
126 |
+
|
127 |
+
return (
|
128 |
+
<div key={arg} className="space-y-2">
|
129 |
+
<label htmlFor={arg} className="text-sm font-medium flex items-center">
|
130 |
+
{arg.replace(/_/g, ' ').replace(/url/i, 'URL')}
|
131 |
+
<Badge variant="outline" className="ml-2 text-xs font-normal bg-gray-50 text-gray-500 border-gray-200 dark:bg-gray-800 dark:text-gray-400 dark:border-gray-700">
|
132 |
+
Optional
|
133 |
+
</Badge>
|
134 |
+
</label>
|
135 |
+
<Input
|
136 |
+
id={arg}
|
137 |
+
type="text"
|
138 |
+
value={value}
|
139 |
+
onChange={(e) => {
|
140 |
+
const newValue = e.target.value;
|
141 |
+
setLocalValues(prev => ({ ...prev, [arg]: newValue }));
|
142 |
+
debouncedUpdateConfig({ [arg]: newValue });
|
143 |
+
}}
|
144 |
+
disabled={updateConfig.isPending}
|
145 |
+
className="font-mono"
|
146 |
+
/>
|
147 |
+
{arg === 'openai_model' && (
|
148 |
+
<p className="text-xs text-muted-foreground mt-1">
|
149 |
+
Enter comma-separated model names to add multiple custom models (e.g. "gpt-4-turbo,gpt-4-vision")
|
150 |
+
</p>
|
151 |
+
)}
|
152 |
+
</div>
|
153 |
+
);
|
154 |
+
})}
|
155 |
+
</CollapsibleContent>
|
156 |
+
</Collapsible>
|
157 |
+
</>
|
158 |
+
) : provider === PROVIDERS.huggingface ? (
|
159 |
+
<>
|
160 |
+
<div className="space-y-2">
|
161 |
+
<label htmlFor="hf_token" className="text-sm font-medium flex items-center">
|
162 |
+
API Token
|
163 |
+
</label>
|
164 |
+
<div className="flex gap-2">
|
165 |
+
<Input
|
166 |
+
id="hf_token"
|
167 |
+
type="password"
|
168 |
+
placeholder="Enter your Hugging Face API token"
|
169 |
+
value={localValues.hf_token ?? config?.hf_token ?? ''}
|
170 |
+
onChange={(e) => {
|
171 |
+
const newValue = e.target.value;
|
172 |
+
setLocalValues(prev => ({ ...prev, hf_token: newValue }));
|
173 |
+
debouncedUpdateConfig({ hf_token: newValue });
|
174 |
+
}}
|
175 |
+
disabled={updateConfig.isPending}
|
176 |
+
className="font-mono"
|
177 |
+
/>
|
178 |
+
<Button
|
179 |
+
variant="outline"
|
180 |
+
onClick={handleHuggingFaceOAuth}
|
181 |
+
disabled={updateConfig.isPending}
|
182 |
+
>
|
183 |
+
Connect
|
184 |
+
</Button>
|
185 |
+
</div>
|
186 |
+
</div>
|
187 |
+
<div className="space-y-2">
|
188 |
+
<label htmlFor="hf_custom_models" className="text-sm font-medium flex items-center">
|
189 |
+
Custom Models (comma-separated)
|
190 |
+
</label>
|
191 |
+
<Input
|
192 |
+
id="hf_custom_models"
|
193 |
+
placeholder="e.g. Qwen/Qwen2-VL-7B-Instruct,mistralai/Mixtral-8x7B-Instruct-v0.1"
|
194 |
+
value={localValues.hf_custom_models ?? config?.hf_custom_models ?? ''}
|
195 |
+
onChange={(e) => {
|
196 |
+
const newValue = e.target.value;
|
197 |
+
setLocalValues(prev => ({ ...prev, hf_custom_models: newValue }));
|
198 |
+
debouncedUpdateConfig({ hf_custom_models: newValue });
|
199 |
+
}}
|
200 |
+
disabled={updateConfig.isPending}
|
201 |
+
className="font-mono"
|
202 |
+
/>
|
203 |
+
<p className="text-xs text-muted-foreground">
|
204 |
+
Add custom models in the format: owner/model-name
|
205 |
+
</p>
|
206 |
+
</div>
|
207 |
+
</>
|
208 |
+
) : (
|
209 |
+
// Non-OpenAI providers
|
210 |
+
PROVIDERS_CONN_ARGS_MAP[provider].map((arg) => {
|
211 |
+
const value = localValues[arg] ?? config[arg as keyof typeof config] ?? '';
|
212 |
+
|
213 |
+
return (
|
214 |
+
<div key={arg} className="space-y-2">
|
215 |
+
<label htmlFor={arg} className="text-sm font-medium flex items-center">
|
216 |
+
{arg.replace(/_/g, ' ').replace(/url/i, 'URL')}
|
217 |
+
</label>
|
218 |
+
<Input
|
219 |
+
id={arg}
|
220 |
+
type="text"
|
221 |
+
value={value}
|
222 |
+
onChange={(e) => {
|
223 |
+
const newValue = e.target.value;
|
224 |
+
setLocalValues(prev => ({ ...prev, [arg]: newValue }));
|
225 |
+
debouncedUpdateConfig({ [arg]: newValue });
|
226 |
+
}}
|
227 |
+
disabled={updateConfig.isPending}
|
228 |
+
className="font-mono"
|
229 |
+
/>
|
230 |
+
</div>
|
231 |
+
);
|
232 |
+
})
|
233 |
+
)}
|
234 |
+
</div>
|
235 |
+
{provider === PROVIDERS.ollama && (
|
236 |
+
<>
|
237 |
+
{!config.ollama_base_url || config.ollama_base_url.trim() === '' ? (
|
238 |
+
<Alert className="mt-4 border-yellow-200 text-yellow-800 dark:border-yellow-800 dark:text-yellow-300">
|
239 |
+
<AlertCircle className="h-4 w-4" />
|
240 |
+
<AlertTitle>Ollama Not Configured</AlertTitle>
|
241 |
+
<AlertDescription>
|
242 |
+
Please enter a valid Ollama server URL to enable Ollama models.
|
243 |
+
</AlertDescription>
|
244 |
+
</Alert>
|
245 |
+
) : !config.ollama_available && (
|
246 |
+
<Alert variant="destructive" className="mt-4">
|
247 |
+
<AlertCircle className="h-4 w-4" />
|
248 |
+
<AlertTitle>Connection Error</AlertTitle>
|
249 |
+
<AlertDescription>
|
250 |
+
Could not connect to Ollama server at {config.ollama_base_url}. Please check that Ollama is running and the URL is correct.
|
251 |
+
</AlertDescription>
|
252 |
+
</Alert>
|
253 |
+
)}
|
254 |
+
</>
|
255 |
+
)}
|
256 |
+
</div>
|
257 |
+
))}
|
258 |
+
</CardContent>
|
259 |
+
</ScrollArea>
|
260 |
+
</Card>
|
261 |
+
);
|
262 |
+
}
|
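`Providers.tsx` reads `import.meta.env.VITE_HF_CLIENT_ID` and `VITE_HF_REDIRECT_URI` to build the Hugging Face authorization URL, and the OAuth callback page later in this diff also expects `VITE_HF_CLIENT_SECRET`. A minimal sketch of how these variables could be typed for Vite follows; the declaration file itself is an assumption and is not part of this commit.

```ts
// Sketch only — hypothetical src/vite-env.d.ts augmentation. The variable names
// come from the components above; the file is not included in this commit.
interface ImportMetaEnv {
  readonly VITE_HF_CLIENT_ID: string;      // OAuth client id from the Hugging Face app
  readonly VITE_HF_REDIRECT_URI: string;   // must match the registered redirect URI
  readonly VITE_HF_CLIENT_SECRET: string;  // used when exchanging the code for a token
}

interface ImportMeta {
  readonly env: ImportMetaEnv;
}
```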
src/pages/home/hooks.ts
ADDED
@@ -0,0 +1,5 @@
+import { useConfig, useUpdateConfig, useUpdateEnabledModels } from "@/hooks/use-config";
+import { useDebounceCallback } from "@/hooks/use-debounce";
+
+// Re-export the hooks
+export { useConfig, useUpdateConfig, useUpdateEnabledModels, useDebounceCallback };
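This file only re-exports `useDebounceCallback` from `@/hooks/use-debounce`; the implementation is not part of this diff. For context, a minimal sketch of what such a hook typically looks like (an assumption, not the project's actual code) is:

```ts
// Sketch only — one common shape for a debounced-callback hook.
import { useCallback, useRef } from "react";

export function useDebounceCallbackSketch<A extends unknown[]>(
  fn: (...args: A) => void,
  delayMs: number
) {
  const timer = useRef<ReturnType<typeof setTimeout> | null>(null);

  return useCallback((...args: A) => {
    if (timer.current) clearTimeout(timer.current);          // drop the pending call
    timer.current = setTimeout(() => fn(...args), delayMs);  // fire after the quiet period
  }, [fn, delayMs]);
}
```

Providers.tsx uses this pattern to batch keystrokes into a single config write every 500 ms.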
src/pages/home/page.tsx
CHANGED
@@ -1,271 +1,28 @@
|
|
1 |
-
import {
|
2 |
-
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
|
3 |
-
import { Input } from "@/components/ui/input";
|
4 |
import { Tabs, TabsList, TabsTrigger, TabsContent } from "@/components/ui/tabs";
|
5 |
-
import {
|
6 |
-
import {
|
7 |
-
import {
|
8 |
-
import {
|
9 |
-
import {
|
10 |
-
import {
|
11 |
-
import { useDebounceCallback } from "@/hooks/use-debounce";
|
12 |
-
import { useState } from "react";
|
13 |
-
|
14 |
-
const Providers = ({ config }: { config: IConfig | undefined }) => {
|
15 |
-
const updateConfig = useUpdateConfig();
|
16 |
-
const [localValues, setLocalValues] = useState<Record<string, string>>({});
|
17 |
-
const debouncedUpdateConfig = useDebounceCallback((updates: Record<string, string>) => {
|
18 |
-
updateConfig.mutate(updates);
|
19 |
-
}, 500);
|
20 |
-
|
21 |
-
if (!config) return null;
|
22 |
-
|
23 |
-
return (
|
24 |
-
<Card className="h-[calc(100vh-12rem)]">
|
25 |
-
<CardHeader>
|
26 |
-
<CardTitle>Providers</CardTitle>
|
27 |
-
<CardDescription>Configure your AI provider credentials</CardDescription>
|
28 |
-
</CardHeader>
|
29 |
-
<ScrollArea className="h-[calc(100%-8rem)]">
|
30 |
-
<CardContent className="space-y-8 px-6">
|
31 |
-
{Object.values(PROVIDERS).map((provider) => (
|
32 |
-
<div key={provider} className="space-y-4">
|
33 |
-
<h3 className="text-lg font-semibold capitalize">{provider}</h3>
|
34 |
-
<div className="grid gap-4">
|
35 |
-
{PROVIDERS_CONN_ARGS_MAP[provider].map((arg) => {
|
36 |
-
const value = localValues[arg] ?? config[arg as keyof IConfig] ?? '';
|
37 |
-
return (
|
38 |
-
<div key={arg} className="space-y-2">
|
39 |
-
<label htmlFor={arg} className="text-sm font-medium">{arg.replace(/_/g, ' ').replace(/url/i, 'URL')}</label>
|
40 |
-
<Input
|
41 |
-
id={arg}
|
42 |
-
type="text"
|
43 |
-
value={value}
|
44 |
-
onChange={(e) => {
|
45 |
-
const newValue = e.target.value;
|
46 |
-
setLocalValues(prev => ({ ...prev, [arg]: newValue }));
|
47 |
-
debouncedUpdateConfig({ [arg]: newValue });
|
48 |
-
}}
|
49 |
-
disabled={updateConfig.isPending}
|
50 |
-
className="font-mono"
|
51 |
-
/>
|
52 |
-
</div>
|
53 |
-
);
|
54 |
-
})}
|
55 |
-
</div>
|
56 |
-
</div>
|
57 |
-
))}
|
58 |
-
</CardContent>
|
59 |
-
</ScrollArea>
|
60 |
-
</Card>
|
61 |
-
);
|
62 |
-
}
|
63 |
-
|
64 |
-
interface ModelListProps {
|
65 |
-
type: 'chat' | 'embedding';
|
66 |
-
models: BaseModel[];
|
67 |
-
config: IConfig | undefined;
|
68 |
-
enabledModels: string[];
|
69 |
-
defaultModel: string;
|
70 |
-
title: string;
|
71 |
-
description: string;
|
72 |
-
}
|
73 |
-
|
74 |
-
const ModelList = ({ type, models, config, enabledModels, defaultModel, title, description }: ModelListProps) => {
|
75 |
-
const updateEnabledModels = useUpdateEnabledModels(type);
|
76 |
-
const updateConfig = useUpdateConfig();
|
77 |
-
const [selectedDefault, setSelectedDefault] = useState(defaultModel);
|
78 |
-
|
79 |
-
const handleModelToggle = async (model: BaseModel, enabled: boolean) => {
|
80 |
-
await updateEnabledModels.mutateAsync({
|
81 |
-
modelId: model.model,
|
82 |
-
enabled,
|
83 |
-
});
|
84 |
-
};
|
85 |
-
|
86 |
-
const handleDefaultModelChange = async (modelId: string) => {
|
87 |
-
setSelectedDefault(modelId);
|
88 |
-
await updateConfig.mutateAsync({
|
89 |
-
[type === 'chat' ? 'default_chat_model' : 'default_embedding_model']: modelId,
|
90 |
-
});
|
91 |
-
};
|
92 |
-
|
93 |
-
if (!config) return null;
|
94 |
-
|
95 |
-
const isProviderConfigured = (provider: PROVIDERS) => {
|
96 |
-
return PROVIDERS_CONN_ARGS_MAP[provider].every(
|
97 |
-
(arg) => {
|
98 |
-
const value = config[arg as keyof IConfig];
|
99 |
-
return typeof value === 'string' && value.trim().length > 0;
|
100 |
-
}
|
101 |
-
);
|
102 |
-
};
|
103 |
-
|
104 |
-
const handleSelectAll = async () => {
|
105 |
-
for (const model of models) {
|
106 |
-
if (isProviderConfigured(model.provider) && !enabledModels.includes(model.model)) {
|
107 |
-
await updateEnabledModels.mutateAsync({ modelId: model.model, enabled: true });
|
108 |
-
}
|
109 |
-
}
|
110 |
-
};
|
111 |
-
|
112 |
-
const handleUnselectAll = async () => {
|
113 |
-
for (const model of models) {
|
114 |
-
if (enabledModels.includes(model.model)) {
|
115 |
-
await updateEnabledModels.mutateAsync({ modelId: model.model, enabled: false });
|
116 |
-
}
|
117 |
-
}
|
118 |
-
// Then clear the default model
|
119 |
-
updateConfig.mutate({ [`default_${type}_model`]: '' });
|
120 |
-
};
|
121 |
-
|
122 |
-
return (
|
123 |
-
<Card className="h-[calc(100vh-12rem)]">
|
124 |
-
<CardHeader className="space-y-6">
|
125 |
-
<div className="flex items-center justify-between">
|
126 |
-
<div>
|
127 |
-
<CardTitle>{title}</CardTitle>
|
128 |
-
<CardDescription>{description}</CardDescription>
|
129 |
-
</div>
|
130 |
-
<div className="flex gap-2">
|
131 |
-
<Button
|
132 |
-
variant="outline"
|
133 |
-
size="sm"
|
134 |
-
onClick={handleSelectAll}
|
135 |
-
disabled={models.length === 0 || updateConfig.isPending}
|
136 |
-
>
|
137 |
-
Select All
|
138 |
-
</Button>
|
139 |
-
<Button
|
140 |
-
variant="outline"
|
141 |
-
size="sm"
|
142 |
-
onClick={handleUnselectAll}
|
143 |
-
disabled={enabledModels.length === 0 || updateConfig.isPending}
|
144 |
-
>
|
145 |
-
Unselect All
|
146 |
-
</Button>
|
147 |
-
</div>
|
148 |
-
</div>
|
149 |
-
<div>
|
150 |
-
<Select value={selectedDefault} onValueChange={handleDefaultModelChange}>
|
151 |
-
<SelectTrigger className="w-[200px]">
|
152 |
-
<SelectValue placeholder="Select default model" />
|
153 |
-
</SelectTrigger>
|
154 |
-
<SelectContent>
|
155 |
-
{models.filter(model => enabledModels.includes(model.model)).map(model => (
|
156 |
-
<SelectItem key={model.model} value={model.model}>
|
157 |
-
{model.name}
|
158 |
-
</SelectItem>
|
159 |
-
))}
|
160 |
-
</SelectContent>
|
161 |
-
</Select>
|
162 |
-
{enabledModels.length === 0 && (
|
163 |
-
<p className="text-sm text-muted-foreground mt-2">
|
164 |
-
Enable at least one model to set as default
|
165 |
-
</p>
|
166 |
-
)}
|
167 |
-
</div>
|
168 |
-
</CardHeader>
|
169 |
-
<ScrollArea className="h-[calc(100%-13rem)] px-6">
|
170 |
-
<div className="space-y-4 pb-6">
|
171 |
-
{models.map((model) => {
|
172 |
-
const providerConfigured = isProviderConfigured(model.provider);
|
173 |
-
const isChatModel = (m: BaseModel): m is ChatModel => 'modalities' in m;
|
174 |
-
|
175 |
-
return (
|
176 |
-
<div key={model.model} className={`p-4 rounded-lg border ${providerConfigured ? '' : 'opacity-50'}`}>
|
177 |
-
<div className="flex items-start justify-between">
|
178 |
-
<div className="space-y-1">
|
179 |
-
<h3 className="font-medium">{model.name}</h3>
|
180 |
-
<p className="text-sm text-muted-foreground">{model.description}</p>
|
181 |
-
{!providerConfigured && (
|
182 |
-
<p className="text-sm text-muted-foreground">
|
183 |
-
Configure {model.provider} provider first
|
184 |
-
</p>
|
185 |
-
)}
|
186 |
-
{isChatModel(model) && model.modalities && model.modalities.length > 0 && (
|
187 |
-
<div className="flex gap-2 flex-wrap pt-2">
|
188 |
-
{model.modalities.map((modality) => (
|
189 |
-
<Badge key={modality} variant="outline" className="capitalize">
|
190 |
-
{modality.toLowerCase()}
|
191 |
-
</Badge>
|
192 |
-
))}
|
193 |
-
</div>
|
194 |
-
)}
|
195 |
-
</div>
|
196 |
-
<Switch
|
197 |
-
checked={enabledModels.includes(model.model)}
|
198 |
-
onCheckedChange={(checked) => {
|
199 |
-
handleModelToggle(model, checked);
|
200 |
-
if (!checked && selectedDefault === model.model) {
|
201 |
-
handleDefaultModelChange('');
|
202 |
-
}
|
203 |
-
}}
|
204 |
-
disabled={!providerConfigured || updateEnabledModels.isPending}
|
205 |
-
/>
|
206 |
-
</div>
|
207 |
-
</div>
|
208 |
-
);
|
209 |
-
})}
|
210 |
-
</div>
|
211 |
-
</ScrollArea>
|
212 |
-
</Card>
|
213 |
-
);
|
214 |
-
}
|
215 |
-
|
216 |
-
const ChatModels = ({ config }: { config: IConfig | undefined }) => {
|
217 |
-
if (!config) return null;
|
218 |
-
|
219 |
-
return (
|
220 |
-
<ModelList
|
221 |
-
type="chat"
|
222 |
-
models={CHAT_MODELS}
|
223 |
-
config={config}
|
224 |
-
enabledModels={config.enabled_chat_models || []}
|
225 |
-
defaultModel={config.default_chat_model}
|
226 |
-
title="Chat Models"
|
227 |
-
description="Enable or disable available chat models"
|
228 |
-
/>
|
229 |
-
);
|
230 |
-
}
|
231 |
-
|
232 |
-
const EmbeddingModels = ({ config }: { config: IConfig | undefined }) => {
|
233 |
-
if (!config) return null;
|
234 |
-
|
235 |
-
return (
|
236 |
-
<ModelList
|
237 |
-
type="embedding"
|
238 |
-
models={EMBEDDING_MODELS}
|
239 |
-
config={config}
|
240 |
-
enabledModels={config.enabled_embedding_models || []}
|
241 |
-
defaultModel={config.default_embedding_model}
|
242 |
-
title="Embedding Models"
|
243 |
-
description="Enable or disable available embedding models"
|
244 |
-
/>
|
245 |
-
);
|
246 |
-
}
|
247 |
-
|
248 |
-
const Others = () => {
|
249 |
-
return (
|
250 |
-
<div>
|
251 |
-
<h1>Others</h1>
|
252 |
-
</div>
|
253 |
-
);
|
254 |
-
}
|
255 |
|
256 |
export function HomePage() {
|
257 |
const { data: config, isLoading, error } = useConfig();
|
258 |
-
|
259 |
-
|
260 |
-
|
261 |
-
|
262 |
-
|
263 |
-
|
264 |
-
|
265 |
-
|
266 |
-
|
267 |
-
|
268 |
-
|
|
|
|
|
|
|
269 |
|
270 |
if (error) {
|
271 |
return (
|
@@ -297,7 +54,7 @@ export function HomePage() {
|
|
297 |
<EmbeddingModels config={config} />
|
298 |
</TabsContent>
|
299 |
<TabsContent value="others">
|
300 |
-
<Others />
|
301 |
</TabsContent>
|
302 |
</Tabs>
|
303 |
</div>
|
|
|
1 |
+
import { useEffect } from "react";
|
|
|
|
|
2 |
import { Tabs, TabsList, TabsTrigger, TabsContent } from "@/components/ui/tabs";
|
3 |
+
import { useConfig } from "./hooks";
|
4 |
+
import { useLoading } from "@/contexts/loading-context";
|
5 |
+
import { Providers } from "./components/Providers";
|
6 |
+
import { ChatModels } from "./components/ChatModels";
|
7 |
+
import { EmbeddingModels } from "./components/EmbeddingModels";
|
8 |
+
import { Others } from "./components/Others";
|
9 |
|
10 |
export function HomePage() {
|
11 |
const { data: config, isLoading, error } = useConfig();
|
12 |
+
const { startLoading, stopLoading } = useLoading();
|
13 |
+
|
14 |
+
// Show loading screen during initial config load
|
15 |
+
useEffect(() => {
|
16 |
+
if (isLoading) {
|
17 |
+
startLoading("Loading configuration...");
|
18 |
+
} else {
|
19 |
+
stopLoading();
|
20 |
+
}
|
21 |
+
|
22 |
+
return () => {
|
23 |
+
stopLoading();
|
24 |
+
};
|
25 |
+
}, [isLoading, startLoading, stopLoading]);
|
26 |
|
27 |
if (error) {
|
28 |
return (
|
|
|
54 |
<EmbeddingModels config={config} />
|
55 |
</TabsContent>
|
56 |
<TabsContent value="others">
|
57 |
+
<Others config={config} />
|
58 |
</TabsContent>
|
59 |
</Tabs>
|
60 |
</div>
|
src/pages/home/types.ts
ADDED
@@ -0,0 +1,27 @@
+import { IConfig, BaseModel } from "@/lib/config/types";
+
+export interface ProvidersProps {
+  config: IConfig | undefined;
+}
+
+export interface ModelListProps {
+  type: 'chat' | 'embedding';
+  models: BaseModel[];
+  config: IConfig | undefined;
+  enabledModels: string[];
+  defaultModel: string;
+  title: string;
+  description: string;
+}
+
+export interface ChatModelsProps {
+  config: IConfig | undefined;
+}
+
+export interface EmbeddingModelsProps {
+  config: IConfig | undefined;
+}
+
+export interface OthersProps {
+  config: IConfig | undefined;
+}
src/pages/integrations/huggingface-callback.tsx
ADDED
@@ -0,0 +1,103 @@
+import { useEffect, useState } from "react";
+import { useNavigate } from "react-router-dom";
+import { ConfigManager } from "@/lib/config/manager";
+import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
+import { Button } from "@/components/ui/button";
+import { Loader2 } from "lucide-react";
+
+export function HuggingFaceCallback() {
+  const navigate = useNavigate();
+  const [status, setStatus] = useState<"loading" | "success" | "error">("loading");
+  const [error, setError] = useState<string | null>(null);
+
+  useEffect(() => {
+    async function handleCallback() {
+      try {
+        // Get the authorization code from the URL
+        const urlParams = new URLSearchParams(window.location.search);
+        const code = urlParams.get("code");
+
+        if (!code) {
+          setStatus("error");
+          setError("Authorization code not found in URL");
+          return;
+        }
+
+        // Exchange the code for an access token
+        const response = await fetch("https://huggingface.co/oauth/token", {
+          method: "POST",
+          headers: {
+            "Content-Type": "application/json",
+          },
+          body: JSON.stringify({
+            grant_type: "authorization_code",
+            client_id: import.meta.env.VITE_HF_CLIENT_ID,
+            client_secret: import.meta.env.VITE_HF_CLIENT_SECRET,
+            redirect_uri: import.meta.env.VITE_HF_REDIRECT_URI,
+            code,
+          }),
+        });
+
+        if (!response.ok) {
+          const errorData = await response.json();
+          throw new Error(errorData.error_description || "Failed to exchange code for token");
+        }
+
+        const data = await response.json();
+        const accessToken = data.access_token;
+
+        // Store the access token in the config
+        const configManager = ConfigManager.getInstance();
+        await configManager.updateConfig({ hf_token: accessToken });
+
+        setStatus("success");
+      } catch (err) {
+        console.error("Error during OAuth callback:", err);
+        setStatus("error");
+        setError(err instanceof Error ? err.message : "Unknown error occurred");
+      }
+    }
+
+    handleCallback();
+  }, [navigate]);
+
+  const handleNavigateHome = () => {
+    navigate("/");
+  };
+
+  return (
+    <div className="flex items-center justify-center min-h-screen">
+      <Card className="w-[400px]">
+        <CardHeader>
+          <CardTitle>Hugging Face Integration</CardTitle>
+          <CardDescription>
+            {status === "loading" && "Setting up your Hugging Face integration..."}
+            {status === "success" && "Successfully connected to Hugging Face!"}
+            {status === "error" && "Failed to connect to Hugging Face"}
+          </CardDescription>
+        </CardHeader>
+        <CardContent className="flex flex-col items-center gap-4">
+          {status === "loading" && <Loader2 className="h-8 w-8 animate-spin" />}
+
+          {status === "success" && (
+            <>
+              <div className="text-center text-sm text-muted-foreground">
+                Your Hugging Face API token has been saved. You can now use Hugging Face models.
+              </div>
+              <Button onClick={handleNavigateHome}>Continue to Home</Button>
+            </>
+          )}
+
+          {status === "error" && (
+            <>
+              <div className="text-center text-sm text-destructive">
+                {error || "An error occurred while connecting to Hugging Face."}
+              </div>
+              <Button variant="outline" onClick={handleNavigateHome}>Back to Home</Button>
+            </>
+          )}
+        </CardContent>
+      </Card>
+    </div>
+  );
+}
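The redirect URI registered with the Hugging Face OAuth app has to resolve to this page. The actual route registration is not visible in this excerpt (only the closing lines of `src/routes.tsx` are touched below), so the following is just a sketch of one possible react-router wiring; the route constant and path are assumptions.

```tsx
// Sketch only — hypothetical route entry; the real registration is not in this diff.
import { Route } from "react-router-dom";
import { HuggingFaceCallback } from "@/pages/integrations/huggingface-callback";

export const huggingFaceCallbackRoute = (
  <Route
    path="/integrations/huggingface-callback"  // must match VITE_HF_REDIRECT_URI
    element={<HuggingFaceCallback />}
  />
);
```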
src/polyfills.ts
CHANGED
@@ -2,7 +2,7 @@ import { Buffer } from 'buffer';

// Make Buffer available globally
if (typeof window !== 'undefined') {
-  (window as
+  (window as unknown as { Buffer: typeof Buffer }).Buffer = Buffer;
}

export {};
src/routes.tsx
CHANGED
@@ -7,5 +7,5 @@ export const routes = [
    label: "Home",
    icon: <HomeIcon />,
    component: <HomePage />,
-  }
-];
+  }
+];
test/Dockerfile
ADDED
@@ -0,0 +1,137 @@
1 |
+
FROM i386/debian:buster-slim
|
2 |
+
|
3 |
+
# Set environment variables
|
4 |
+
ENV DEBIAN_FRONTEND=noninteractive
|
5 |
+
ENV PYTHONUNBUFFERED=1
|
6 |
+
ENV PATH="/root/.local/bin:$PATH"
|
7 |
+
|
8 |
+
# Install system dependencies
|
9 |
+
RUN apt-get update && apt-get install -y --no-install-recommends \
|
10 |
+
build-essential \
|
11 |
+
ca-certificates \
|
12 |
+
curl \
|
13 |
+
git \
|
14 |
+
libzmq3-dev \
|
15 |
+
pkg-config \
|
16 |
+
python3 \
|
17 |
+
python3-dev \
|
18 |
+
python3-pip \
|
19 |
+
python3-setuptools \
|
20 |
+
python3-wheel \
|
21 |
+
wget \
|
22 |
+
&& apt-get clean \
|
23 |
+
&& rm -rf /var/lib/apt/lists/*
|
24 |
+
|
25 |
+
# Upgrade pip
|
26 |
+
RUN python3 -m pip install --no-cache-dir --upgrade pip
|
27 |
+
|
28 |
+
# Install core Jupyter components
|
29 |
+
RUN python3 -m pip install --no-cache-dir \
|
30 |
+
jupyter \
|
31 |
+
jupyterlab \
|
32 |
+
notebook \
|
33 |
+
voila
|
34 |
+
|
35 |
+
# Install data science libraries
|
36 |
+
RUN python3 -m pip install --no-cache-dir \
|
37 |
+
numpy \
|
38 |
+
pandas \
|
39 |
+
matplotlib \
|
40 |
+
scipy \
|
41 |
+
scikit-learn \
|
42 |
+
seaborn \
|
43 |
+
statsmodels
|
44 |
+
|
45 |
+
# Install file processing libraries
|
46 |
+
RUN python3 -m pip install --no-cache-dir \
|
47 |
+
openpyxl \
|
48 |
+
xlrd \
|
49 |
+
xlwt \
|
50 |
+
pyarrow \
|
51 |
+
fastparquet \
|
52 |
+
python-docx \
|
53 |
+
pdfminer.six \
|
54 |
+
beautifulsoup4 \
|
55 |
+
lxml \
|
56 |
+
pillow
|
57 |
+
|
58 |
+
# Install WASM-friendly packages
|
59 |
+
RUN python3 -m pip install --no-cache-dir \
|
60 |
+
pyodide-pack
|
61 |
+
|
62 |
+
# Set up workspace directory
|
63 |
+
WORKDIR /workspace
|
64 |
+
RUN mkdir -p /workspace/data
|
65 |
+
|
66 |
+
# Configure Jupyter
|
67 |
+
RUN mkdir -p /root/.jupyter && \
|
68 |
+
echo "c.NotebookApp.ip = '0.0.0.0'" >> /root/.jupyter/jupyter_notebook_config.py && \
|
69 |
+
echo "c.NotebookApp.port = 8888" >> /root/.jupyter/jupyter_notebook_config.py && \
|
70 |
+
echo "c.NotebookApp.open_browser = False" >> /root/.jupyter/jupyter_notebook_config.py && \
|
71 |
+
echo "c.NotebookApp.allow_root = True" >> /root/.jupyter/jupyter_notebook_config.py && \
|
72 |
+
echo "c.NotebookApp.token = ''" >> /root/.jupyter/jupyter_notebook_config.py && \
|
73 |
+
echo "c.NotebookApp.password = ''" >> /root/.jupyter/jupyter_notebook_config.py
|
74 |
+
|
75 |
+
# Entry point script for WASM compatibility
|
76 |
+
COPY <<EOT /workspace/prepare_for_wasm.py
|
77 |
+
import os
|
78 |
+
import sys
|
79 |
+
import json
|
80 |
+
import subprocess
|
81 |
+
|
82 |
+
def collect_dependencies():
|
83 |
+
"""Collect all Python dependencies for WASM conversion"""
|
84 |
+
result = subprocess.run(
|
85 |
+
[sys.executable, "-m", "pip", "freeze"],
|
86 |
+
capture_output=True,
|
87 |
+
text=True
|
88 |
+
)
|
89 |
+
return result.stdout.strip().split('\n')
|
90 |
+
|
91 |
+
def create_manifest():
|
92 |
+
"""Create a manifest file for WASM conversion"""
|
93 |
+
deps = collect_dependencies()
|
94 |
+
manifest = {
|
95 |
+
"name": "jupyter-datascience-wasm",
|
96 |
+
"version": "1.0.0",
|
97 |
+
"description": "Jupyter notebook with data science libraries for WASM",
|
98 |
+
"dependencies": deps,
|
99 |
+
"main": "server.py",
|
100 |
+
"files": [
|
101 |
+
"server.py",
|
102 |
+
"/root/.jupyter/jupyter_notebook_config.py"
|
103 |
+
]
|
104 |
+
}
|
105 |
+
|
106 |
+
with open("/workspace/wasm_manifest.json", "w") as f:
|
107 |
+
json.dump(manifest, f, indent=2)
|
108 |
+
|
109 |
+
print("Created WASM manifest file at /workspace/wasm_manifest.json")
|
110 |
+
|
111 |
+
if __name__ == "__main__":
|
112 |
+
create_manifest()
|
113 |
+
print("Prepared environment for WASM conversion")
|
114 |
+
EOT
|
115 |
+
|
116 |
+
# Create a simple server script for the WASM environment
|
117 |
+
COPY <<EOT /workspace/server.py
|
118 |
+
#!/usr/bin/env python3
|
119 |
+
import os
|
120 |
+
import sys
|
121 |
+
from notebook.notebookapp import main
|
122 |
+
|
123 |
+
if __name__ == "__main__":
|
124 |
+
sys.exit(main())
|
125 |
+
EOT
|
126 |
+
|
127 |
+
# Make the server script executable
|
128 |
+
RUN chmod +x /workspace/server.py
|
129 |
+
|
130 |
+
# Prepare for WASM conversion
|
131 |
+
RUN python3 /workspace/prepare_for_wasm.py
|
132 |
+
|
133 |
+
# Expose the Jupyter port
|
134 |
+
EXPOSE 8888
|
135 |
+
|
136 |
+
# Set the default command to start the Jupyter server
|
137 |
+
CMD ["python3", "/workspace/server.py"]
|
test/jupyter-wasm-launcher.html
ADDED
@@ -0,0 +1,85 @@
1 |
+
<!DOCTYPE html>
|
2 |
+
<html lang="en">
|
3 |
+
<head>
|
4 |
+
<meta charset="UTF-8">
|
5 |
+
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
6 |
+
<title>Jupyter WASM Instance</title>
|
7 |
+
<style>
|
8 |
+
body {
|
9 |
+
font-family: Arial, sans-serif;
|
10 |
+
max-width: 800px;
|
11 |
+
margin: 0 auto;
|
12 |
+
padding: 20px;
|
13 |
+
}
|
14 |
+
button {
|
15 |
+
padding: 10px 15px;
|
16 |
+
font-size: 16px;
|
17 |
+
background-color: #4CAF50;
|
18 |
+
color: white;
|
19 |
+
border: none;
|
20 |
+
border-radius: 4px;
|
21 |
+
cursor: pointer;
|
22 |
+
}
|
23 |
+
button:hover {
|
24 |
+
background-color: #45a049;
|
25 |
+
}
|
26 |
+
#status {
|
27 |
+
margin-top: 20px;
|
28 |
+
padding: 15px;
|
29 |
+
border-radius: 4px;
|
30 |
+
background-color: #f8f9fa;
|
31 |
+
}
|
32 |
+
#jupyter-frame {
|
33 |
+
width: 100%;
|
34 |
+
height: 600px;
|
35 |
+
border: 1px solid #ddd;
|
36 |
+
margin-top: 20px;
|
37 |
+
display: none;
|
38 |
+
}
|
39 |
+
</style>
|
40 |
+
</head>
|
41 |
+
<body>
|
42 |
+
<h1>Jupyter WASM Instance</h1>
|
43 |
+
<p>This page runs a Jupyter notebook server directly in your browser using WebAssembly.</p>
|
44 |
+
|
45 |
+
<button id="start-button">Start Jupyter Server</button>
|
46 |
+
<div id="status">Status: Ready to launch</div>
|
47 |
+
<iframe id="jupyter-frame" sandbox="allow-scripts allow-same-origin"></iframe>
|
48 |
+
|
49 |
+
<script>
|
50 |
+
document.getElementById('start-button').addEventListener('click', async () => {
|
51 |
+
const statusEl = document.getElementById('status');
|
52 |
+
const frameEl = document.getElementById('jupyter-frame');
|
53 |
+
|
54 |
+
statusEl.textContent = 'Status: Loading WASM module...';
|
55 |
+
|
56 |
+
try {
|
57 |
+
// Import the WASM module
|
58 |
+
const wasmModule = await WebAssembly.instantiateStreaming(
|
59 |
+
fetch('jupyter-datascience.wasm'),
|
60 |
+
{}
|
61 |
+
);
|
62 |
+
|
63 |
+
statusEl.textContent = 'Status: Starting Jupyter server...';
|
64 |
+
|
65 |
+
// Initialize the WASM instance
|
66 |
+
const instance = wasmModule.instance;
|
67 |
+
await instance.exports._start();
|
68 |
+
|
69 |
+
// Connect to the virtual server
|
70 |
+
const port = 8888; // The port defined in the Docker image
|
71 |
+
const url = `http://localhost:${port}`;
|
72 |
+
|
73 |
+
statusEl.textContent = 'Status: Jupyter server running at ' + url;
|
74 |
+
|
75 |
+
// Display in iframe
|
76 |
+
frameEl.style.display = 'block';
|
77 |
+
frameEl.src = url;
|
78 |
+
} catch (error) {
|
79 |
+
statusEl.textContent = 'Status: Error - ' + error.message;
|
80 |
+
console.error(error);
|
81 |
+
}
|
82 |
+
});
|
83 |
+
</script>
|
84 |
+
</body>
|
85 |
+
</html>
|
test/start.sh
ADDED
@@ -0,0 +1,115 @@
1 |
+
#!/bin/bash
|
2 |
+
# Script to build and convert the Docker image to WASM
|
3 |
+
|
4 |
+
# Step 1: Build the Docker image
|
5 |
+
echo "Building the 32-bit Jupyter Docker image..."
|
6 |
+
docker build -t jupyter-datascience-32bit -f Dockerfile .
|
7 |
+
|
8 |
+
# Step 2: Install Docker-WASM tools
|
9 |
+
echo "Installing Docker-WASM conversion tools..."
|
10 |
+
npm install -g @wasmer/wasm-transformer
|
11 |
+
npm install -g docker-wasm-cli
|
12 |
+
|
13 |
+
# Step 3: Export the Docker image
|
14 |
+
echo "Exporting Docker image to a tarball..."
|
15 |
+
docker save jupyter-datascience-32bit -o jupyter-datascience-32bit.tar
|
16 |
+
|
17 |
+
# Step 4: Convert the Docker image to WASM
|
18 |
+
echo "Converting Docker image to WASM format..."
|
19 |
+
docker-wasm convert jupyter-datascience-32bit.tar --output jupyter-datascience.wasm
|
20 |
+
|
21 |
+
# Step 5: Create a simple HTML launcher
|
22 |
+
echo "Creating HTML launcher for the WASM module..."
|
23 |
+
cat > jupyter-wasm-launcher.html << 'EOL'
|
24 |
+
<!DOCTYPE html>
|
25 |
+
<html lang="en">
|
26 |
+
<head>
|
27 |
+
<meta charset="UTF-8">
|
28 |
+
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
29 |
+
<title>Jupyter WASM Instance</title>
|
30 |
+
<style>
|
31 |
+
body {
|
32 |
+
font-family: Arial, sans-serif;
|
33 |
+
max-width: 800px;
|
34 |
+
margin: 0 auto;
|
35 |
+
padding: 20px;
|
36 |
+
}
|
37 |
+
button {
|
38 |
+
padding: 10px 15px;
|
39 |
+
font-size: 16px;
|
40 |
+
background-color: #4CAF50;
|
41 |
+
color: white;
|
42 |
+
border: none;
|
43 |
+
border-radius: 4px;
|
44 |
+
cursor: pointer;
|
45 |
+
}
|
46 |
+
button:hover {
|
47 |
+
background-color: #45a049;
|
48 |
+
}
|
49 |
+
#status {
|
50 |
+
margin-top: 20px;
|
51 |
+
padding: 15px;
|
52 |
+
border-radius: 4px;
|
53 |
+
background-color: #f8f9fa;
|
54 |
+
}
|
55 |
+
#jupyter-frame {
|
56 |
+
width: 100%;
|
57 |
+
height: 600px;
|
58 |
+
border: 1px solid #ddd;
|
59 |
+
margin-top: 20px;
|
60 |
+
display: none;
|
61 |
+
}
|
62 |
+
</style>
|
63 |
+
</head>
|
64 |
+
<body>
|
65 |
+
<h1>Jupyter WASM Instance</h1>
|
66 |
+
<p>This page runs a Jupyter notebook server directly in your browser using WebAssembly.</p>
|
67 |
+
|
68 |
+
<button id="start-button">Start Jupyter Server</button>
|
69 |
+
<div id="status">Status: Ready to launch</div>
|
70 |
+
<iframe id="jupyter-frame" sandbox="allow-scripts allow-same-origin"></iframe>
|
71 |
+
|
72 |
+
<script>
|
73 |
+
document.getElementById('start-button').addEventListener('click', async () => {
|
74 |
+
const statusEl = document.getElementById('status');
|
75 |
+
const frameEl = document.getElementById('jupyter-frame');
|
76 |
+
|
77 |
+
statusEl.textContent = 'Status: Loading WASM module...';
|
78 |
+
|
79 |
+
try {
|
80 |
+
// Import the WASM module
|
81 |
+
const wasmModule = await WebAssembly.instantiateStreaming(
|
82 |
+
fetch('jupyter-datascience.wasm'),
|
83 |
+
{}
|
84 |
+
);
|
85 |
+
|
86 |
+
statusEl.textContent = 'Status: Starting Jupyter server...';
|
87 |
+
|
88 |
+
// Initialize the WASM instance
|
89 |
+
const instance = wasmModule.instance;
|
90 |
+
await instance.exports._start();
|
91 |
+
|
92 |
+
// Connect to the virtual server
|
93 |
+
const port = 8888; // The port defined in the Docker image
|
94 |
+
const url = `http://localhost:${port}`;
|
95 |
+
|
96 |
+
statusEl.textContent = 'Status: Jupyter server running at ' + url;
|
97 |
+
|
98 |
+
// Display in iframe
|
99 |
+
frameEl.style.display = 'block';
|
100 |
+
frameEl.src = url;
|
101 |
+
} catch (error) {
|
102 |
+
statusEl.textContent = 'Status: Error - ' + error.message;
|
103 |
+
console.error(error);
|
104 |
+
}
|
105 |
+
});
|
106 |
+
</script>
|
107 |
+
</body>
|
108 |
+
</html>
|
109 |
+
EOL
|
110 |
+
|
111 |
+
echo "Done! You now have:"
|
112 |
+
echo "1. jupyter-datascience.wasm - The WASM module"
|
113 |
+
echo "2. jupyter-wasm-launcher.html - A HTML launcher for the WASM module"
|
114 |
+
echo ""
|
115 |
+
echo "To use, serve these files with a web server that supports WebAssembly and WASM-compatible headers."
|