chansung committed on
Commit fccc18d · verified · 1 Parent(s): e0bfcd4

Upload folder using huggingface_hub

.gitignore CHANGED
@@ -1,3 +1,4 @@
+ temp_attachments
  .DS_Store
 
  # Byte-compiled / optimized / DLL files
README.md CHANGED
@@ -325,3 +325,29 @@ $ python main.py # or gradio main.py
 
  # Acknowledgments
  This is a project built during the Vertex sprints held by Google's ML Developer Programs team. We are thankful to be granted a generous amount of GCP credits for this project.
+ # AdaptSum
+
+ [![Sync to Hugging Face Spaces](https://github.com/deep-diver/AdaptSum/actions/workflows/sync_to_spaces.yml/badge.svg)](https://github.com/deep-diver/AdaptSum/actions/workflows/sync_to_spaces.yml)
+
+ AdaptSum stands for Adaptive Summarization. This project focuses on developing an LLM-powered system for dynamic summarization. Instead of generating an entirely new summary with each update, the system identifies and modifies only the parts of the existing summary that need to change. The goal is a more efficient and fluid summarization process within a continuous chat interaction with an LLM.
+
+ # Instructions
+
+ 1. Install dependencies
+ ```shell
+ $ pip install -r requirements.txt
+ ```
+
+ 2. Set up the Gemini API key
+ ```shell
+ $ export GEMINI_API_KEY=xxxxx
+ ```
+ > Note that the Gemini API key must be obtained from Google AI Studio. Vertex AI is not supported at the moment, because the Gemini SDK does not yet provide file-uploading functionality for Vertex AI.
+
+ 3. Run the Gradio app
+ ```shell
+ $ python main.py # or gradio main.py
+ ```
+
+ # Acknowledgments
+ This is a project built during the Vertex sprints held by Google's ML Developer Programs team. We are thankful to be granted a generous amount of GCP credits for this project.
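For context, the "update the summary as you chat" behavior described above surfaces in the standalone front end added later in this commit: after each streamed assistant reply, the session summary is refreshed whenever the per-session `enableSummarization` flag is on. The sketch below condenses that flow from `standalone/front/scripts/events_helper.js` and `api.js`; the `sendTurn` wrapper name is introduced here purely for illustration and is not defined in the commit.

```js
// Condensed sketch of the per-turn flow wired up in events_helper.js / api.js (added below).
// `sendTurn` is an illustrative name, not a function defined in this commit.
import { callLLMStream, callLLMSummaryBatch } from './api.js';

export async function sendTurn(session, conversation, signal) {
  // Stream the assistant reply and store it on the last message of the session.
  const reply = await callLLMStream(conversation, signal);
  session.messages[session.messages.length - 1].aiResponse = reply;

  // When summarization is enabled, the committed code re-summarizes the full
  // message history after every turn via the model-specific *_summary endpoint.
  if (session.settings.enableSummarization) {
    await callLLMSummaryBatch(
      session.id,
      session.messages,
      session.settings.model,
      session.settings.temperature,
      session.settings.maxTokens
    );
  }
  return reply;
}
```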
standalone/front/assets/favicon-32x32.png ADDED
standalone/front/assets/horizontal.svg ADDED
standalone/front/assets/new-indicator.svg ADDED
standalone/front/assets/send.svg ADDED
standalone/front/assets/settings.svg ADDED
standalone/front/assets/stop.svg ADDED
standalone/front/assets/summarize.svg ADDED
standalone/front/assets/vertical.svg ADDED
standalone/front/index.html ADDED
@@ -0,0 +1,163 @@
1
+ <!DOCTYPE html>
2
+ <html lang="en">
3
+ <head>
4
+ <meta charset="UTF-8" />
5
+ <meta name="viewport" content="width=device-width, initial-scale=1" />
6
+ <title>Chat UI with Streaming ChatGPT API, Markdown Messages, Layout Toggle, Summary & Settings</title>
7
+ <link href="https://fonts.googleapis.com/css2?family=Poppins:wght@400;500&display=swap" rel="stylesheet">
8
+ <!-- Include marked.js for Markdown rendering -->
9
+ <script src="https://cdn.jsdelivr.net/npm/marked/marked.min.js"></script>
10
+ <link rel="stylesheet" href="styles/style.css">
11
+ <link rel="icon" type="image/png" href="assets/favicon-32x32.png">
12
+ <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/atom-one-dark.min.css">
13
+ <script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/highlight.min.js"></script>
14
+ </head>
15
+ <body>
16
+ <div class="app-container">
17
+ <!-- Left Navigation Bar -->
18
+ <div class="nav-bar" id="navBar">
19
+ <div class="nav-header">
20
+ <!-- Hamburger & New Chat will appear here when nav is NOT collapsed -->
21
+ <button id="hamburgerBtn">&#9776;</button>
22
+ <button class="new-session-btn" id="newSessionBtn">
23
+ <img src="assets/new-indicator.svg" alt="Icon" class="svg-icon">
24
+ </button>
25
+ <button id="toggleLayoutBtn"><img src="assets/vertical.svg" alt="Icon" class="svg-icon"></button>
26
+ </div>
27
+ <!-- <h3>Chat History</h3> -->
28
+ <ul id="sessionList"></ul>
29
+ </div>
30
+
31
+ <!-- Main Chat Wrapper -->
32
+ <div class="chat-wrapper">
33
+ <!-- Chat Header -->
34
+ <div class="chat-header">
35
+ <!-- Left portion for hamburger & new chat when collapsed -->
36
+ <div class="header-left" id="headerLeft"></div>
37
+
38
+ <!-- Center portion for the chat title -->
39
+ <div class="chat-title-controls">
40
+ <h2 id="chatTitle">Chat Session 1</h2>
41
+ <button id="editTitleBtn">Edit Title</button>
42
+ </div>
43
+
44
+ <!-- Right portion for turn label -->
45
+ <div id="turnLabel">Turn: 0 / 0</div>
46
+ </div>
47
+
48
+ <!-- Carousel Wrapper -->
49
+ <div class="carousel-wrapper" id="carouselWrapper">
50
+ <button id="prevBtn" class="nav">
51
+ <svg width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="#444" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
52
+ <polyline points="15 18 9 12 15 6"></polyline>
53
+ </svg>
54
+ </button>
55
+ <div class="carousel" id="carousel"></div>
56
+ <button id="nextBtn" class="nav">
57
+ <svg width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="#444" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
58
+ <polyline points="9 18 15 12 9 6"></polyline>
59
+ </svg>
60
+ </button>
61
+ </div>
62
+
63
+ <!-- Input Section -->
64
+ <div class="input-container">
65
+ <div class="button-row">
66
+ <button id="customBtn1">TLDR</button>
67
+ <button id="customBtn3" class="toggle-btn-summarize"><img src="assets/summarize.svg" alt="Icon" class="svg-icon"></button>
68
+ <button id="customBtn2"><img src="assets/settings.svg" alt="Icon" class="svg-icon"></button>
69
+ </div>
70
+ <div id="fileAttachments" class="file-attachments"></div>
71
+ <div class="input-row">
72
+ <button id="attachBtn" class="attach-button">+</button>
73
+ <input type="file" id="fileInput" multiple accept="application/pdf">
74
+ <textarea id="chatInput" placeholder="Ask Anything"></textarea>
75
+ <button id="sendBtn">
76
+ <img src="assets/send.svg" alt="Icon" class="svg-icon-non-white">
77
+ </button>
78
+ </div>
79
+ </div>
80
+ </div>
81
+ </div>
82
+
83
+ <!-- Summary Overlay -->
84
+ <div id="summaryOverlay">
85
+ <div class="summary-header">
86
+ <span>Chat Summary</span>
87
+ <div class="summary-header-buttons">
88
+ <button id="downloadSummary" class="download-summary">Download</button>
89
+ <button class="close-summary" id="closeSummaryBtn">&times;</button>
90
+ </div>
91
+ </div>
92
+ <div class="summary-content markdown-body" id="summaryContent"></div>
93
+ </div>
94
+
95
+ <!-- Settings Overlay -->
96
+ <div id="settingsOverlay">
97
+ <div class="settings-header">
98
+ <span>Settings</span>
99
+ <button class="close-settings" id="closeSettingsBtn">&times;</button>
100
+ </div>
101
+ <div class="settings-content">
102
+ <form id="settingsForm">
103
+ <div class="settings-group">
104
+ <label for="temperature">Temperature:</label>
105
+ <input type="range" id="temperature" name="temperature" min="0" max="1" step="0.01">
106
+ <span id="temperatureValue"></span>
107
+ </div>
108
+ <div class="settings-group">
109
+ <label for="maxTokens">Max Tokens:</label>
110
+ <input type="number" id="maxTokens" name="maxTokens" min="10" max="2048">
111
+ </div>
112
+ <div class="settings-section">
113
+ <h3>Model Selection</h3>
114
+ <div class="settings-group">
115
+ <select id="modelSelect" name="modelSelect">
116
+ <optgroup label="OpenAI">
117
+ <option value="gpt-4o">gpt-4o</option>
118
+ <option value="gpt-4o-mini">gpt-4o-mini</option>
119
+ </optgroup>
120
+ <optgroup label="Anthropic">
121
+ <option value="claude-3.5-sonnet">claude-3.5-sonnet</option>
122
+ <option value="claude-3.7-sonnet">claude-3.7-sonnet</option>
123
+ </optgroup>
124
+ <optgroup label="Google">
125
+ <option value="gemini-2.0-flash">Gemini 2.0 Flash</option>
126
+ <option value="gemini-2.0-flash-lite">Gemini 2.0 Flash Lite</option>
127
+ </optgroup>
128
+ <optgroup label="Hugging Face">
129
+ <option value="huggingface/meta-llama/Llama-3.3-70B-Instruct">Llama 3.3 70B Instruct</option>
130
+ <option value="huggingface/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B">DeepSeek R1 Distill Qwen 32B</option>
131
+ <option value="huggingface/Qwen/Qwen2.5-72B-Instruct">Qwen 2.5 72B Instruct</option>
132
+ </optgroup>
133
+ <optgroup label="Mistral">
134
+ <option value="mistral-large-latest">Mistral Large</option>
135
+ <option value="mistral-codestral-latest">CodeStral</option>
136
+ <option value="mistral-ministral-8b-latest">MiniStral 8B</option>
137
+ </optgroup>
138
+ </select>
139
+ </div>
140
+ </div>
141
+ <div class="settings-group">
142
+ <label for="persona">Persona:</label>
143
+ <select id="persona" name="persona">
144
+ <option value="professional">Professional</option>
145
+ <option value="friendly">Friendly</option>
146
+ </select>
147
+ </div>
148
+ <button type="button" id="saveSettings" class="save-settings">Save Settings</button>
149
+ </form>
150
+ </div>
151
+ </div>
152
+
153
+ <div id="loadingOverlay" class="loading-overlay">
154
+ <div class="loading-animation">
155
+ <div class="ripple"></div>
156
+ <div class="ripple"></div>
157
+ </div>
158
+ <p>Generating summary...</p>
159
+ </div>
160
+
161
+ <script type="module" src="scripts/main.js"></script>
162
+ </body>
163
+ </html>
standalone/front/scripts/api.js ADDED
@@ -0,0 +1,249 @@
1
+ // api.js
2
+ import { updateLastMessage } from './utils.js';
3
+ import { sessions, currentSessionIndex } from './sessions.js';
4
+
5
+ /**
6
+ * Chooses the correct API function based on the model setting.
7
+ */
8
+ export async function callLLMStream(conversation, signal) {
9
+ const session = sessions[currentSessionIndex];
10
+ const { model, temperature, maxTokens } = session.settings;
11
+
12
+ if (model.startsWith("gpt-4o")) {
13
+ return callOpenAIStream(session, conversation, model, temperature, maxTokens, signal);
14
+ } else if (model.startsWith("claude")) {
15
+ return callAnthropicStream(session, conversation, model, temperature, maxTokens, signal);
16
+ } else if (model.startsWith("gemini")) {
17
+ return callGoogleStream(session, conversation, model, temperature, maxTokens, signal);
18
+ } else if (model.startsWith("huggingface")) {
19
+ return callHuggingFaceStream(session, conversation, model.replace("huggingface/", ""), temperature, maxTokens, signal);
20
+ } else if (model.startsWith("mistral")) {
21
+ return callMistralStream(session, conversation, model, temperature, maxTokens, signal);
22
+ } else {
23
+ throw new Error("Unsupported model: " + model);
24
+ }
25
+ }
26
+
27
+ /**
28
+ * Process streaming response from LLM APIs
29
+ */
30
+ async function processStream(reader, decoder, session, errorPrefix = "Stream") {
31
+ let done = false;
32
+ let aiMessage = "";
33
+ let buffer = "";
34
+
35
+ updateLastMessage(session, aiMessage, true);
36
+
37
+ while (!done) {
38
+ const { value, done: doneReading } = await reader.read();
39
+ done = doneReading;
40
+
41
+ // Decode the chunk and append it to the buffer
42
+ buffer += decoder.decode(value, { stream: true });
43
+
44
+ // Process the buffer based on newline delimiters
45
+ const parts = buffer.split("\n\n");
46
+ // Keep the last part in buffer as it might be incomplete
47
+ buffer = parts.pop();
48
+
49
+ // Process complete parts
50
+ for (const part of parts) {
51
+ const trimmed = part.trim();
52
+ if (!trimmed.startsWith("data:")) continue;
53
+
54
+ const dataStr = trimmed.substring(5).trim(); // Remove "data:" prefix
55
+ if (dataStr === "[DONE]") {
56
+ done = true;
57
+ break;
58
+ }
59
+
60
+ try {
61
+ const parsed = JSON.parse(dataStr);
62
+ const delta = parsed.choices[0].delta.content;
63
+ if (delta) {
64
+ aiMessage += delta;
65
+ updateLastMessage(session, aiMessage, true);
66
+ }
67
+ } catch (err) {
68
+ console.error(`${errorPrefix} parsing error:`, err, "Chunk:", dataStr);
69
+ }
70
+ }
71
+ }
72
+
73
+ // Process any remaining buffered data
74
+ if (buffer.trim()) {
75
+ try {
76
+ const trimmed = buffer.trim();
77
+ if (trimmed.startsWith("data:")) {
78
+ const dataStr = trimmed.substring(5).trim();
79
+ if (dataStr !== "[DONE]") {
80
+ const parsed = JSON.parse(dataStr);
81
+ const delta = parsed.choices[0].delta.content;
82
+ if (delta) {
83
+ aiMessage += delta;
84
+ updateLastMessage(session, aiMessage, true);
85
+ }
86
+ }
87
+ }
88
+ } catch (err) {
89
+ console.error(`Final buffer parsing error:`, err, "Buffer:", buffer);
90
+ }
91
+ }
92
+
93
+ updateLastMessage(session, aiMessage, false);
94
+ return aiMessage;
95
+ }
96
+
97
+ /**
98
+ * Helper function to create API request options
99
+ */
100
+ function createRequestOptions(session, payload, signal) {
101
+ return {
102
+ method: "POST",
103
+ headers: {
104
+ "Content-Type": "application/json",
105
+ "X-Session-ID": session.id,
106
+ },
107
+ body: JSON.stringify(payload),
108
+ signal: signal,
109
+ };
110
+ }
111
+
112
+ export async function callOpenAIStream(session, conversation, model, temperature, maxTokens, signal) {
113
+ const response = await fetch("http://127.0.0.1:8000/openai_stream",
114
+ createRequestOptions(session, {
115
+ conversation: conversation,
116
+ temperature: temperature,
117
+ max_tokens: maxTokens,
118
+ model: model,
119
+ }, signal)
120
+ );
121
+
122
+ const reader = response.body.getReader();
123
+ const decoder = new TextDecoder("utf-8");
124
+ return processStream(reader, decoder, session, "OpenAI stream");
125
+ }
126
+
127
+ export async function callAnthropicStream(session, conversation, model, temperature, maxTokens, signal) {
128
+ model = model.toLowerCase().replace(/\s+/g, '-').replace(/\./g, '-');
129
+ console.log(`Calling Anthropic API with model: ${model}`);
130
+
131
+ const response = await fetch("http://127.0.0.1:8000/anthropic_stream",
132
+ createRequestOptions(session, {
133
+ messages: conversation,
134
+ temperature: temperature,
135
+ max_tokens: maxTokens,
136
+ model: model + "-latest",
137
+ }, signal)
138
+ );
139
+
140
+ const reader = response.body.getReader();
141
+ const decoder = new TextDecoder("utf-8");
142
+ return processStream(reader, decoder, session, "Anthropic stream");
143
+ }
144
+
145
+ export async function callGoogleStream(session, conversation, model, temperature, maxTokens, signal) {
146
+ model = model.toLowerCase().replace(/\s+/g, '-');
147
+ console.log(`Calling Google (Gemini) API with model: ${model}`);
148
+
149
+ const response = await fetch("http://127.0.0.1:8000/gemini_stream",
150
+ createRequestOptions(session, {
151
+ messages: conversation,
152
+ temperature: temperature,
153
+ max_tokens: maxTokens,
154
+ model: model,
155
+ }, signal)
156
+ );
157
+
158
+ const reader = response.body.getReader();
159
+ const decoder = new TextDecoder("utf-8");
160
+ return processStream(reader, decoder, session, "Gemini stream");
161
+ }
162
+
163
+ export async function callHuggingFaceStream(session, conversation, model, temperature, maxTokens, signal) {
164
+ console.log(`Calling Hugging Face API with model: ${model}`);
165
+
166
+ const response = await fetch("http://127.0.0.1:8000/huggingface_stream",
167
+ createRequestOptions(session, {
168
+ messages: conversation,
169
+ temperature: temperature,
170
+ max_tokens: maxTokens,
171
+ model: model,
172
+ }, signal)
173
+ );
174
+
175
+ const reader = response.body.getReader();
176
+ const decoder = new TextDecoder("utf-8");
177
+ return processStream(reader, decoder, session, "Hugging Face stream");
178
+ }
179
+
180
+ export async function callMistralStream(session, conversation, model, temperature, maxTokens, signal) {
181
+ console.log(`Calling Mistral API with model: ${model}`);
182
+
183
+ const response = await fetch("http://127.0.0.1:8000/mistral_stream",
184
+ createRequestOptions(session, {
185
+ conversation: conversation,
186
+ temperature: temperature,
187
+ max_tokens: maxTokens,
188
+ model: model,
189
+ }, signal)
190
+ );
191
+
192
+ const reader = response.body.getReader();
193
+ const decoder = new TextDecoder("utf-8");
194
+ return processStream(reader, decoder, session, "Mistral stream");
195
+ }
196
+
197
+ /**
198
+ * Makes a batch request to generate a summary of the conversation
199
+ */
200
+ export async function callLLMSummaryBatch(sessionId, conversation, model, temperature, maxTokens) {
201
+ const loadingOverlay = document.getElementById("loadingOverlay");
202
+ loadingOverlay.classList.add("active");
203
+
204
+ // Determine the appropriate endpoint based on the model
205
+ const modelToEndpoint = {
206
+ "gpt-4o": "openai_summary",
207
+ "claude": "anthropic_summary",
208
+ "gemini": "gemini_summary",
209
+ "huggingface": "huggingface_summary",
210
+ "mistral": "mistral_summary"
211
+ };
212
+
213
+ // Find the matching prefix
214
+ const modelPrefix = Object.keys(modelToEndpoint).find(prefix => model.startsWith(prefix));
215
+ if (!modelPrefix) {
216
+ throw new Error("Unsupported model for summary: " + model);
217
+ }
218
+
219
+ const endpoint = `http://127.0.0.1:8000/${modelToEndpoint[modelPrefix]}`;
220
+
221
+ try {
222
+ const response = await fetch(endpoint, {
223
+ method: "POST",
224
+ headers: {
225
+ "Content-Type": "application/json",
226
+ "X-Session-ID": sessionId,
227
+ },
228
+ body: JSON.stringify({
229
+ conversation: conversation,
230
+ temperature: temperature,
231
+ max_tokens: maxTokens,
232
+ model: model,
233
+ }),
234
+ });
235
+
236
+ const responseData = await response.json();
237
+ const summaryText = responseData.summary;
238
+ sessions[currentSessionIndex].summary = summaryText;
239
+
240
+ const summaryOverlay = document.getElementById('summaryOverlay');
241
+ if (summaryOverlay.classList.contains('active')) {
242
+ document.getElementById('summaryContent').innerHTML = marked.parse(summaryText);
243
+ }
244
+
245
+ return summaryText;
246
+ } finally {
247
+ loadingOverlay.classList.remove("active");
248
+ }
249
+ }
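A note on the stream format `processStream()` above expects: the backend endpoints at `http://127.0.0.1:8000/*_stream` are not part of this commit, so the exact wire format is inferred from the parser itself (blank-line-delimited `data:` events carrying OpenAI-style `choices[0].delta.content` payloads, terminated by a `[DONE]` sentinel). A minimal, self-contained example of a body the parser can consume:

```js
// Hypothetical response body shaped the way processStream() parses it; the real
// backend is not included in this commit, so this exact format is an assumption.
const exampleBody =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n\n' +
  'data: [DONE]\n\n';

// Mirror of the parsing loop: split on blank lines, strip the "data:" prefix,
// stop at [DONE], and append each delta to the running message.
let message = "";
for (const part of exampleBody.split("\n\n")) {
  const trimmed = part.trim();
  if (!trimmed.startsWith("data:")) continue;
  const dataStr = trimmed.substring(5).trim();
  if (dataStr === "[DONE]") break;
  const delta = JSON.parse(dataStr).choices[0].delta.content;
  if (delta) message += delta;
}
console.log(message); // "Hello"
```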
standalone/front/scripts/events.js ADDED
@@ -0,0 +1,201 @@
1
+ // events.js
2
+ import { addConversation, attachedFiles, updateFileAttachments } from './events_helper.js';
3
+ import { sessions, currentSessionIndex, getCurrentCardIndex, setCurrentCardIndex, updateCarousel } from './sessions.js';
4
+ import { updateHamburgerPosition, toggleLayout } from './navigation.js';
5
+ import { updateLastMessage } from './utils.js';
6
+
7
+ // Global variables for streaming state
8
+ window.isStreaming = false;
9
+ window.currentStreamController = null;
10
+
11
+ // File attachment events
12
+ const attachBtn = document.getElementById('attachBtn');
13
+ const fileInput = document.getElementById('fileInput');
14
+ attachBtn.addEventListener('click', () => {
15
+ fileInput.click();
16
+ });
17
+ fileInput.addEventListener('change', () => {
18
+ for (const file of fileInput.files) {
19
+ attachedFiles.push(file);
20
+ }
21
+ fileInput.value = "";
22
+ updateFileAttachments();
23
+ });
24
+
25
+ // Send button event
26
+ const chatInput = document.getElementById('chatInput');
27
+ const sendBtn = document.getElementById('sendBtn');
28
+ sendBtn.addEventListener('click', async () => {
29
+ if (window.isStreaming) {
30
+ if (window.currentStreamController) {
31
+ window.currentStreamController.abort();
32
+ }
33
+ const session = sessions[currentSessionIndex];
34
+ const lastIndex = session.messages.length - 1;
35
+ const currentContent = session.messages[lastIndex].aiResponse;
36
+ // Use the imported updateLastMessage instead of window.updateLastMessage
37
+ const cleanedContent = currentContent.replace(`<span class="blinking-cursor"></span>`, '');
38
+ updateLastMessage(session, cleanedContent, false);
39
+
40
+ sendBtn.innerHTML = `<img src="assets/send.svg" alt="Send Icon" class="svg-icon-non-white">`;
41
+ window.isStreaming = false;
42
+ return;
43
+ }
44
+ const text = chatInput.value;
45
+ if (text.trim() !== '') {
46
+ await addConversation(text);
47
+ chatInput.value = '';
48
+ chatInput.style.height = '36px';
49
+ }
50
+ });
51
+ chatInput.addEventListener('keydown', function (e) {
52
+ if (e.key === 'Enter' && (e.ctrlKey || e.metaKey)) {
53
+ e.preventDefault();
54
+ sendBtn.click();
55
+ }
56
+ });
57
+
58
+ // Navigation events
59
+ const prevBtn = document.getElementById('prevBtn');
60
+ const nextBtn = document.getElementById('nextBtn');
61
+ prevBtn.addEventListener('click', () => {
62
+ if (getCurrentCardIndex() > 0) {
63
+ setCurrentCardIndex(getCurrentCardIndex() -1);
64
+ }
65
+ });
66
+
67
+ nextBtn.addEventListener('click', () => {
68
+ const cards = document.querySelectorAll('.card');
69
+ if (getCurrentCardIndex() < cards.length - 1) {
70
+ setCurrentCardIndex(getCurrentCardIndex() + 1);
71
+ }
72
+ });
73
+
74
+ // Hamburger toggle
75
+ const hamburgerBtn = document.getElementById('hamburgerBtn');
76
+ const navBar = document.getElementById('navBar');
77
+ hamburgerBtn.addEventListener('click', (e) => {
78
+ e.stopPropagation();
79
+ navBar.classList.toggle('hidden');
80
+ updateHamburgerPosition();
81
+ });
82
+
83
+ // Summary overlay events
84
+ const summaryOverlay = document.getElementById('summaryOverlay');
85
+ const closeSummaryBtn = document.getElementById('closeSummaryBtn');
86
+ const downloadSummaryBtn = document.getElementById('downloadSummary');
87
+ closeSummaryBtn.addEventListener('click', () => {
88
+ summaryOverlay.classList.remove('active');
89
+ });
90
+ downloadSummaryBtn.addEventListener('click', () => {
91
+ const blob = new Blob([sessions[currentSessionIndex].summary], { type: "text/markdown" });
92
+ const url = URL.createObjectURL(blob);
93
+ const a = document.createElement("a");
94
+ a.href = url;
95
+ a.download = "summary.md";
96
+ a.click();
97
+ URL.revokeObjectURL(url);
98
+ });
99
+
100
+ // Settings overlay events
101
+ const settingsOverlay = document.getElementById('settingsOverlay');
102
+ const closeSettingsBtn = document.getElementById('closeSettingsBtn');
103
+ const temperatureInput = document.getElementById('temperature');
104
+ const temperatureValue = document.getElementById('temperatureValue');
105
+ const maxTokensInput = document.getElementById('maxTokens');
106
+ const personaSelect = document.getElementById('persona');
107
+ const saveSettingsBtn = document.getElementById('saveSettings');
108
+ const modelSelect = document.getElementById('modelSelect');
109
+ closeSettingsBtn.addEventListener('click', () => {
110
+ settingsOverlay.classList.remove('active');
111
+ });
112
+ temperatureInput.addEventListener('input', () => {
113
+ temperatureValue.textContent = temperatureInput.value;
114
+ });
115
+ saveSettingsBtn.addEventListener('click', () => {
116
+ const sessionSettings = sessions[currentSessionIndex].settings;
117
+ sessionSettings.temperature = parseFloat(temperatureInput.value);
118
+ sessionSettings.maxTokens = parseInt(maxTokensInput.value);
119
+ sessionSettings.persona = personaSelect.value;
120
+ sessionSettings.model = modelSelect.value;
121
+ console.log('Session settings saved:', sessions[currentSessionIndex].settings);
122
+ settingsOverlay.classList.remove('active');
123
+ });
124
+ const editTitleBtn = document.getElementById('editTitleBtn');
125
+ editTitleBtn.addEventListener('click', () => {
126
+ const currentTitle = sessions[currentSessionIndex].title;
127
+ const newTitle = prompt("Enter new chat title:", currentTitle);
128
+ if (newTitle !== null && newTitle.trim() !== "") {
129
+ sessions[currentSessionIndex].title = newTitle.trim();
130
+ document.getElementById('chatTitle').textContent = newTitle.trim();
131
+ }
132
+ });
133
+
134
+ // Custom buttons for summary and settings
135
+ const customBtn1 = document.getElementById('customBtn1');
136
+ const customBtn2 = document.getElementById('customBtn2');
137
+ customBtn1.addEventListener('click', (e) => {
138
+ e.stopPropagation();
139
+ document.getElementById('summaryContent').innerHTML = marked.parse(sessions[currentSessionIndex].summary);
140
+ summaryOverlay.classList.add('active');
141
+ settingsOverlay.classList.remove('active');
142
+ });
143
+ customBtn2.addEventListener('click', (e) => {
144
+ e.stopPropagation();
145
+ const settings = sessions[currentSessionIndex].settings;
146
+ temperatureInput.value = settings.temperature;
147
+ temperatureValue.textContent = settings.temperature;
148
+ maxTokensInput.value = settings.maxTokens;
149
+ personaSelect.value = settings.persona;
150
+ settingsOverlay.classList.add('active');
151
+ summaryOverlay.classList.remove('active');
152
+ openSettingsForCurrentSession();
153
+ });
154
+ function openSettingsForCurrentSession() {
155
+ const settings = sessions[currentSessionIndex].settings;
156
+ modelSelect.value = settings.model;
157
+ }
158
+
159
+ // Global keyboard navigation
160
+ document.addEventListener('keydown', (e) => {
161
+ const chatInput = document.getElementById('chatInput');
162
+ // Only trigger when the chat input is not focused
163
+ if (document.activeElement !== chatInput) {
164
+ if (e.key === 'ArrowLeft' && getCurrentCardIndex() > 0) {
165
+ setCurrentCardIndex(getCurrentCardIndex() - 1);
166
+ } else if (e.key === 'ArrowRight') {
167
+ const cards = document.querySelectorAll('.card');
168
+ if (getCurrentCardIndex() < cards.length - 1) {
169
+ setCurrentCardIndex(getCurrentCardIndex() + 1);
170
+ }
171
+ }
172
+ }
173
+ });
174
+
175
+ // Auto-dismiss overlays when clicking outside
176
+ document.addEventListener('click', (e) => {
177
+ if (summaryOverlay.classList.contains('active') && !summaryOverlay.contains(e.target)) {
178
+ summaryOverlay.classList.remove('active');
179
+ }
180
+ if (settingsOverlay.classList.contains('active') && !settingsOverlay.contains(e.target)) {
181
+ settingsOverlay.classList.remove('active');
182
+ }
183
+ });
184
+
185
+ const toggleLayoutBtn = document.getElementById('toggleLayoutBtn');
186
+ toggleLayoutBtn.addEventListener('click', () => {
187
+ toggleLayout();
188
+ // If needed, update any other UI parts (like turn labels)
189
+ });
190
+
191
+ const summarizeToggleBtn = document.getElementById('customBtn3');
192
+
193
+ summarizeToggleBtn.addEventListener('click', () => {
194
+ // Toggle the summarization flag for the current session.
195
+ const currentState = sessions[currentSessionIndex].settings.enableSummarization;
196
+ console.log('Current state:', currentState);
197
+ sessions[currentSessionIndex].settings.enableSummarization = !currentState;
198
+
199
+ // Update the button's visual state.
200
+ summarizeToggleBtn.classList.toggle('active', !currentState);
201
+ });
standalone/front/scripts/events_helper.js ADDED
@@ -0,0 +1,112 @@
1
+ // events_helper.js
2
+ import { updateLastMessage, autoResizeTextarea } from './utils.js';
3
+ import { renderCurrentSession, sessions, currentSessionIndex } from './sessions.js';
4
+ import { callLLMStream, callLLMSummaryBatch } from './api.js';
5
+
6
+ export let attachedFiles = [];
7
+
8
+ /**
9
+ * Convert a file to its Base64 string.
10
+ */
11
+ export async function fileToBase64(file) {
12
+ return new Promise((resolve, reject) => {
13
+ const reader = new FileReader();
14
+ reader.onload = () => {
15
+ const base64 = reader.result.split(',')[1];
16
+ resolve({
17
+ name: file.name,
18
+ path: file.webkitRelativePath || file.path || file.name,
19
+ size: file.size,
20
+ type: file.type,
21
+ content: base64,
22
+ });
23
+ };
24
+ reader.onerror = reject;
25
+ reader.readAsDataURL(file);
26
+ });
27
+ }
28
+
29
+ /**
30
+ * Update the file attachments UI.
31
+ */
32
+ export function updateFileAttachments() {
33
+ const fileAttachments = document.getElementById('fileAttachments');
34
+ fileAttachments.innerHTML = "";
35
+ attachedFiles.forEach((file, index) => {
36
+ const fileDiv = document.createElement("div");
37
+ fileDiv.className = "file-item";
38
+ fileDiv.innerHTML = `<span>${file.name}</span> <button data-index="${index}">&times;</button>`;
39
+ fileAttachments.appendChild(fileDiv);
40
+ });
41
+ document.querySelectorAll(".file-item button").forEach(btn => {
42
+ btn.addEventListener("click", (e) => {
43
+ const idx = e.target.getAttribute("data-index");
44
+ attachedFiles.splice(idx, 1);
45
+ updateFileAttachments();
46
+ });
47
+ });
48
+ }
49
+
50
+ /**
51
+ * Add a new conversation message and stream the AI response.
52
+ */
53
+ export async function addConversation(userText) {
54
+ if (userText.trim() === '' && attachedFiles.length === 0) return;
55
+ const session = sessions[currentSessionIndex];
56
+ session.messages.push({
57
+ userText,
58
+ aiResponse: "",
59
+ attachments: await Promise.all(attachedFiles.map(fileToBase64)),
60
+ model: session.settings.model,
61
+ timestamp: new Date().toISOString(),
62
+ maxTokens: session.settings.maxTokens,
63
+ temperature: session.settings.temperature,
64
+ persona: session.settings.persona,
65
+ sessionId: session.id,
66
+ });
67
+
68
+ // Clear attachments
69
+ attachedFiles.length = 0;
70
+ updateFileAttachments();
71
+ renderCurrentSession();
72
+
73
+ // Build conversation context for API
74
+ const conversation = [];
75
+ session.messages.forEach(msg => {
76
+ conversation.push({ role: "user", content: msg.userText, attachments: msg.attachments, sessionId: msg.sessionId });
77
+ if (msg.aiResponse) {
78
+ conversation.push({ role: "assistant", content: msg.aiResponse, sessionId: msg.sessionId });
79
+ }
80
+ });
81
+
82
+ window.currentStreamController = new AbortController();
83
+ window.isStreaming = true;
84
+ const sendBtn = document.getElementById('sendBtn');
85
+ sendBtn.innerHTML = `<img src="assets/stop.svg" alt="Stop Icon" class="svg-icon-non-white">`;
86
+
87
+ try {
88
+ const aiResponse = await callLLMStream(conversation, window.currentStreamController.signal);
89
+ session.messages[session.messages.length - 1].aiResponse = aiResponse;
90
+ renderCurrentSession();
91
+ if (session.settings.enableSummarization) {
92
+ await callLLMSummaryBatch(
93
+ session.id,
94
+ session.messages,
95
+ session.settings.model,
96
+ session.settings.temperature,
97
+ session.settings.maxTokens
98
+ );
99
+ }
100
+ } catch (err) {
101
+ if (err.name === 'AbortError') {
102
+ console.log('Streaming aborted by user.');
103
+ } else {
104
+ console.error(err);
105
+ session.messages[session.messages.length - 1].aiResponse = "Error: " + err.message;
106
+ renderCurrentSession();
107
+ }
108
+ } finally {
109
+ sendBtn.innerHTML = `<img src="assets/send.svg" alt="Send Icon" class="svg-icon-non-white">`;
110
+ window.isStreaming = false;
111
+ }
112
+ }
standalone/front/scripts/main.js ADDED
@@ -0,0 +1,17 @@
1
+ // main.js
2
+ import { initSessions, sessions, currentSessionIndex, renderCurrentSession } from './sessions.js';
3
+ import { updateHamburgerPosition } from './navigation.js';
4
+
5
+ // Import events so they register their listeners.
6
+ import './events.js';
7
+ // Import utilities to set marked options.
8
+ import './utils.js';
9
+
10
+ // Initialize sessions
11
+ initSessions();
12
+ updateHamburgerPosition();
13
+
14
+ // Attach sessions-related variables to the window for global access
15
+ window.sessions = sessions;
16
+ window.currentSessionIndex = currentSessionIndex;
17
+ window.renderCurrentSession = renderCurrentSession;
standalone/front/scripts/navigation.js ADDED
@@ -0,0 +1,48 @@
1
+ // navigation.js
2
+
3
+ export let isTraditionalLayout = false;
4
+
5
+ export function updateLayout() {
6
+ const carousel = document.getElementById('carousel');
7
+ const carouselWrapper = document.getElementById('carouselWrapper');
8
+ const toggleLayoutBtn = document.getElementById('toggleLayoutBtn');
9
+ const prevBtn = document.getElementById('prevBtn');
10
+ const nextBtn = document.getElementById('nextBtn');
11
+
12
+ if (isTraditionalLayout) {
13
+ carousel.classList.add('traditional');
14
+ carouselWrapper.classList.add('traditional-mode');
15
+ prevBtn.style.display = 'none';
16
+ nextBtn.style.display = 'none';
17
+ toggleLayoutBtn.innerHTML = `<img src="assets/vertical.svg" alt="Icon" class="svg-icon">`;
18
+ } else {
19
+ carousel.classList.remove('traditional');
20
+ carouselWrapper.classList.remove('traditional-mode');
21
+ prevBtn.style.display = '';
22
+ nextBtn.style.display = '';
23
+ toggleLayoutBtn.innerHTML = `<img src="assets/horizontal.svg" alt="Icon" class="svg-icon">`;
24
+ }
25
+ }
26
+
27
+ export function updateHamburgerPosition() {
28
+ const hamburgerBtn = document.getElementById('hamburgerBtn');
29
+ const newSessionBtn = document.getElementById('newSessionBtn');
30
+ const toggleLayoutBtn = document.getElementById('toggleLayoutBtn');
31
+ const navBar = document.getElementById('navBar');
32
+ const navHeader = navBar.querySelector('.nav-header');
33
+ const headerLeft = document.getElementById('headerLeft');
34
+ if (navBar.classList.contains('hidden')) {
35
+ headerLeft.appendChild(hamburgerBtn);
36
+ headerLeft.appendChild(newSessionBtn);
37
+ headerLeft.appendChild(toggleLayoutBtn);
38
+ } else {
39
+ navHeader.appendChild(hamburgerBtn);
40
+ navHeader.appendChild(newSessionBtn);
41
+ navHeader.appendChild(toggleLayoutBtn);
42
+ }
43
+ }
44
+
45
+ export function toggleLayout() {
46
+ isTraditionalLayout = !isTraditionalLayout;
47
+ updateLayout();
48
+ }
standalone/front/scripts/script.js ADDED
@@ -0,0 +1,942 @@
1
+ let isStreaming = false;
2
+ let currentStreamController = null;
3
+
4
+ marked.setOptions({
5
+ highlight: function (code, lang) {
6
+ if (lang && hljs.getLanguage(lang)) {
7
+ return hljs.highlight(code, { language: lang }).value;
8
+ }
9
+ return hljs.highlightAuto(code).value;
10
+ },
11
+ });
12
+
13
+ function formatTimestamp(isoString) {
14
+ const date = new Date(isoString);
15
+ return date.toLocaleString([], {
16
+ year: 'numeric',
17
+ month: 'short',
18
+ day: 'numeric',
19
+ hour: '2-digit',
20
+ minute: '2-digit'
21
+ });
22
+ }
23
+
24
+ // Modified to scroll the current card to the bottom after updating the message.
25
+ function updateLastMessage(content, isStreaming = false) {
26
+ const session = sessions[currentSessionIndex];
27
+ const cursorHTML = `<span class="blinking-cursor"></span>`;
28
+ session.messages[session.messages.length - 1].aiResponse = isStreaming ? content + cursorHTML : content;
29
+
30
+ // Re-render the entire conversation
31
+ renderCurrentSession();
32
+
33
+ // Wait until the DOM updates, then highlight code blocks
34
+ requestAnimationFrame(() => {
35
+ document.querySelectorAll('pre code').forEach((block) => {
36
+ hljs.highlightElement(block);
37
+ });
38
+ // Optionally adjust scroll position here as well.
39
+ const lastCard = document.querySelector('.card:last-child');
40
+ if (lastCard) {
41
+ lastCard.scrollTop = isStreaming ? lastCard.scrollHeight : lastCard.scrollTop;
42
+ }
43
+ });
44
+ }
45
+
46
+ // ----------------- Layout and Navigation -----------------
47
+ let isTraditionalLayout = false;
48
+ const toggleLayoutBtn = document.getElementById('toggleLayoutBtn');
49
+ const carouselWrapper = document.getElementById('carouselWrapper');
50
+ function updateLayout() {
51
+ if (isTraditionalLayout) {
52
+ carousel.classList.add('traditional');
53
+ carouselWrapper.classList.add('traditional-mode');
54
+ prevBtn.style.display = 'none';
55
+ nextBtn.style.display = 'none';
56
+ toggleLayoutBtn.innerHTML = `<img src="assets/vertical.svg" alt="Icon" class="svg-icon">`;
57
+ } else {
58
+ carousel.classList.remove('traditional');
59
+ carouselWrapper.classList.remove('traditional-mode');
60
+ prevBtn.style.display = '';
61
+ nextBtn.style.display = '';
62
+ toggleLayoutBtn.innerHTML = `<img src="assets/horizontal.svg" alt="Icon" class="svg-icon">`;
63
+ }
64
+ updateTurnLabel(sessionMessagesCount());
65
+ }
66
+ toggleLayoutBtn.addEventListener('click', function () {
67
+ isTraditionalLayout = !isTraditionalLayout;
68
+ updateLayout();
69
+ });
70
+
71
+ // This function will move hamburger + new chat button between nav-bar and the chat header
72
+
73
+ // ----------------- Session Management -----------------
74
+ let sessions = [];
75
+ let currentSessionIndex = 0;
76
+ let currentCardIndex = 0;
77
+ function initSessions() {
78
+ sessions.push({
79
+ id: Date.now() + '-' + Math.random().toString(36).substr(2, 9),
80
+ name: "Chat Session 1",
81
+ title: "Chat Session 1",
82
+ messages: [],
83
+ summary: "# Chat Summary\n\nThis is the default summary for Chat Session 1.",
84
+ settings: {
85
+ temperature: 0.7,
86
+ maxTokens: 8096,
87
+ persona: "professional",
88
+ model: "gpt-4o-mini",
89
+ enableSummarization: false
90
+ }
91
+ });
92
+ currentSessionIndex = 0;
93
+ currentCardIndex = 0;
94
+ renderSessionList();
95
+ renderCurrentSession();
96
+ }
97
+ function renderSessionList() {
98
+ const sessionList = document.getElementById('sessionList');
99
+ sessionList.innerHTML = "";
100
+ sessions.forEach((session, index) => {
101
+ const li = document.createElement('li');
102
+ const nameSpan = document.createElement('span');
103
+ nameSpan.textContent = session.name;
104
+ li.appendChild(nameSpan);
105
+ const removeBtn = document.createElement('button');
106
+ removeBtn.textContent = "𝘅";
107
+ removeBtn.className = "remove-session";
108
+ removeBtn.addEventListener('click', (e) => {
109
+ e.stopPropagation();
110
+ removeSession(index);
111
+ });
112
+ li.appendChild(removeBtn);
113
+ li.addEventListener('click', () => {
114
+ currentSessionIndex = index;
115
+ currentCardIndex = 0;
116
+ renderSessionList();
117
+ renderCurrentSession();
118
+ });
119
+ if (index === currentSessionIndex) li.classList.add('active');
120
+ sessionList.appendChild(li);
121
+ });
122
+ }
123
+ document.getElementById('newSessionBtn').addEventListener('click', () => {
124
+ const newSession = {
125
+ id: Date.now() + '-' + Math.random().toString(36).substr(2, 9),
126
+ name: "Chat Session " + (sessions.length + 1),
127
+ title: "Chat Session " + (sessions.length + 1),
128
+ messages: [],
129
+ summary: "# Chat Summary\n\nThis is the default summary for Chat Session " + (sessions.length + 1) + ".",
130
+ settings: {
131
+ temperature: 0.7,
132
+ maxTokens: 8096,
133
+ persona: "professional",
134
+ model: "gpt-4o-mini",
135
+ enableSummarization: false
136
+ }
137
+ };
138
+
139
+ sessions.push(newSession);
140
+ currentSessionIndex = sessions.length - 1;
141
+ currentCardIndex = 0;
142
+ renderSessionList();
143
+ renderCurrentSession();
144
+ });
145
+ function removeSession(index) {
146
+ sessions.splice(index, 1);
147
+ if (sessions.length === 0) {
148
+ initSessions();
149
+ } else {
150
+ if (currentSessionIndex >= sessions.length) {
151
+ currentSessionIndex = sessions.length - 1;
152
+ }
153
+ currentCardIndex = 0;
154
+ }
155
+ renderSessionList();
156
+ renderCurrentSession();
157
+ }
158
+ // ----------------- Carousel Rendering / Traditional Layout -----------------
159
+ const carousel = document.getElementById('carousel');
160
+ function addCopyButtons() {
161
+ document.querySelectorAll('.markdown-body pre').forEach((pre) => {
162
+ // Avoid adding multiple copy buttons
163
+ if (pre.querySelector('.copy-button')) return;
164
+
165
+ // Ensure the pre element can position the button correctly
166
+ pre.style.position = "relative";
167
+
168
+ // Create the copy button
169
+ const button = document.createElement("button");
170
+ button.className = "copy-button";
171
+ button.innerText = "Copy";
172
+
173
+ // Append the button inside the <pre> element
174
+ pre.appendChild(button);
175
+
176
+ // Add click event listener for copying the code
177
+ button.addEventListener("click", () => {
178
+ // Get the code text from the child <code> element
179
+ const codeText = pre.querySelector("code").innerText;
180
+ navigator.clipboard.writeText(codeText).then(() => {
181
+ button.innerText = "Copied!";
182
+ setTimeout(() => {
183
+ button.innerText = "Copy";
184
+ }, 2000);
185
+ }).catch((err) => {
186
+ console.error("Failed to copy code: ", err);
187
+ });
188
+ });
189
+ });
190
+ }
191
+
192
+ function renderCurrentSession() {
193
+ const session = sessions[currentSessionIndex];
194
+ carousel.innerHTML = "";
195
+ session.messages.forEach(message => {
196
+ const card = document.createElement('div');
197
+ card.className = 'card';
198
+ let attachmentHTML = "";
199
+ if (message.attachments && message.attachments.length > 0) {
200
+ attachmentHTML = `
201
+ <div class="vertical-file-list">
202
+ ${message.attachments.map(file => `<div class="file-item-vertical">${file.path}</div>`).join("")}
203
+ </div>
204
+ `;
205
+ }
206
+ // Order: User message then AI message, rendered in Markdown
207
+ card.innerHTML = `
208
+ <div class="conversation">
209
+ <div class="message user">
210
+ ${attachmentHTML}
211
+ <div class="message-text markdown-body">${marked.parse(message.userText)}</div>
212
+ </div>
213
+ <div class="message ai">
214
+ <div class="message-text markdown-body">${marked.parse(message.aiResponse)}</div>
215
+ <div class="ai-meta">
216
+ <span class="ai-model">${message.model}</span>
217
+ <span class="ai-timestamp"> @${formatTimestamp(message.timestamp)}</span>
218
+ </div>
219
+ </div>
220
+ </div>
221
+ `;
222
+ carousel.appendChild(card);
223
+ processMessagesInContainer(card);
224
+ });
225
+ currentCardIndex = session.messages.length > 0 ? session.messages.length - 1 : 0;
226
+ updateCarousel();
227
+ updateLayout();
228
+ document.getElementById('chatTitle').textContent = session.title;
229
+
230
+ // Re-run syntax highlighting for any new code blocks
231
+ document.querySelectorAll('.markdown-body pre code').forEach((block) => {
232
+ hljs.highlightElement(block);
233
+ });
234
+
235
+ // Add copy buttons to each code block
236
+ addCopyButtons();
237
+ }
238
+
239
+ function updateCarousel() {
240
+ if (!isTraditionalLayout) {
241
+ const cards = document.querySelectorAll('.card');
242
+ carousel.style.transform = `translateX(-${currentCardIndex * 100}%)`;
243
+ }
244
+ updateTurnLabel(sessionMessagesCount());
245
+ }
246
+ function sessionMessagesCount() {
247
+ return sessions[currentSessionIndex].messages.length;
248
+ }
249
+ function updateTurnLabel(totalCards) {
250
+ const turnLabel = document.getElementById('turnLabel');
251
+ if (isTraditionalLayout) {
252
+ turnLabel.textContent = `Turn: ${totalCards} / ${totalCards}`;
253
+ } else {
254
+ turnLabel.textContent = totalCards ? `Turn: ${currentCardIndex + 1} / ${totalCards}` : "Turn: 0 / 0";
255
+ }
256
+ }
257
+ // ----------------- Message Processing -----------------
258
+ function processMessage(messageEl) {
259
+ if (messageEl.dataset.processed) return;
260
+ const textEl = messageEl.querySelector('.message-text');
261
+ // Only add "Read more" toggle for user messages if text is long.
262
+ if (messageEl.classList.contains('user') && textEl.scrollHeight > 80) {
263
+ const toggleBtn = document.createElement('button');
264
+ toggleBtn.className = 'toggle-btn';
265
+ toggleBtn.textContent = 'Read more';
266
+ toggleBtn.addEventListener('click', function () {
267
+ if (messageEl.classList.contains('expanded')) {
268
+ messageEl.classList.remove('expanded');
269
+ toggleBtn.textContent = 'Read more';
270
+ } else {
271
+ messageEl.classList.add('expanded');
272
+ toggleBtn.textContent = 'Read less';
273
+ }
274
+ });
275
+ messageEl.appendChild(toggleBtn);
276
+ }
277
+ messageEl.dataset.processed = 'true';
278
+ }
279
+ function processMessagesInContainer(container) {
280
+ container.querySelectorAll('.message').forEach(processMessage);
281
+ }
282
+ // ----------------- Adding Conversation & Stream API Call -----------------
283
+ async function fileToBase64(file) {
284
+ return new Promise((resolve, reject) => {
285
+ const reader = new FileReader();
286
+ reader.onload = () => {
287
+ // Get the base64 string (remove the data URL prefix)
288
+ const base64 = reader.result.split(',')[1];
289
+ resolve({
290
+ name: file.name,
291
+ path: file.webkitRelativePath || file.path || file.name,
292
+ size: file.size,
293
+ type: file.type,
294
+ content: base64
295
+ });
296
+ };
297
+ reader.onerror = reject;
298
+ reader.readAsDataURL(file);
299
+ });
300
+ }
301
+
302
+ const attachedFiles = [];
303
+
304
+ async function addConversation(userText) {
305
+ if (userText.trim() === '' && attachedFiles.length === 0) return;
306
+
307
+ sessions[currentSessionIndex].messages.push({
308
+ userText,
309
+ aiResponse: "",
310
+ attachments: await Promise.all(attachedFiles.map(fileToBase64)),
311
+ model: sessions[currentSessionIndex].settings.model,
312
+ timestamp: new Date().toISOString(), // Store the timestamp as ISO string
313
+ maxTokens: sessions[currentSessionIndex].settings.maxTokens,
314
+ temperature: sessions[currentSessionIndex].settings.temperature,
315
+ persona: sessions[currentSessionIndex].settings.persona,
316
+ sessionId: sessions[currentSessionIndex].id
317
+ });
318
+
319
+ clearFileAttachments();
320
+ renderCurrentSession();
321
+
322
+ const conversation = [];
323
+ sessions[currentSessionIndex].messages.forEach(msg => {
324
+ conversation.push({ role: "user", content: msg.userText, attachments: msg.attachments, sessionId: msg.sessionId });
325
+ if (msg.aiResponse) {
326
+ conversation.push({ role: "assistant", content: msg.aiResponse, sessionId: msg.sessionId });
327
+ }
328
+ });
329
+
330
+ // Set up the AbortController for streaming
331
+ currentStreamController = new AbortController();
332
+ isStreaming = true;
333
+ // Change the send button icon to a stop button icon
334
+ sendBtn.innerHTML = `<img src="assets/stop.svg" alt="Stop Icon" class="svg-icon-non-white">`;
335
+
336
+ try {
337
+ // Pass the abort signal to your streaming function
338
+ const aiResponse = await callLLMStream(conversation, currentStreamController.signal);
339
+ sessions[currentSessionIndex].messages[sessions[currentSessionIndex].messages.length - 1].aiResponse = aiResponse;
340
+ renderCurrentSession();
341
+ // If summarization is enabled, call it after the stream is complete
342
+ if (sessions[currentSessionIndex].settings.enableSummarization) {
343
+ const session = sessions[currentSessionIndex];
344
+ await callLLMSummaryBatch(
345
+ session.id,
346
+ sessions[currentSessionIndex].messages,
347
+ session.settings.model,
348
+ session.settings.temperature,
349
+ session.settings.maxTokens
350
+ );
351
+ }
352
+ } catch (err) {
353
+ if (err.name === 'AbortError') {
354
+ console.log('Streaming aborted by user.');
355
+ } else {
356
+ console.error(err);
357
+ sessions[currentSessionIndex].messages[sessions[currentSessionIndex].messages.length - 1].aiResponse = "Error: " + err.message;
358
+ renderCurrentSession();
359
+ }
360
+ } finally {
361
+ // Revert the button icon to send after streaming stops/aborts
362
+ sendBtn.innerHTML = `<img src="assets/send.svg" alt="Send Icon" class="svg-icon-non-white">`;
363
+ isStreaming = false;
364
+ }
365
+ }
366
+
367
+ function clearFileAttachments() {
368
+ attachedFiles.length = 0;
369
+ updateFileAttachments();
370
+ }
371
+ // ----------------- Auto-resize Textarea -----------------
372
+ const chatInput = document.getElementById('chatInput');
373
+ chatInput.addEventListener('input', function () {
374
+ this.style.height = 'auto';
375
+ this.style.height = this.scrollHeight + 'px';
376
+ });
377
+ function resetTextarea() {
378
+ chatInput.style.height = '36px';
379
+ }
380
+ // ----------------- Send Message -----------------
381
+ const sendBtn = document.getElementById('sendBtn');
382
+ sendBtn.addEventListener('click', async () => {
383
+ // If streaming is already in progress, stop it.
384
+ if (isStreaming) {
385
+ if (currentStreamController) {
386
+ currentStreamController.abort(); // Abort the streaming fetch
387
+ }
388
+ // Remove blinking cursor from the last message:
389
+ const session = sessions[currentSessionIndex];
390
+ const lastIndex = session.messages.length - 1;
391
+ const currentContent = session.messages[lastIndex].aiResponse;
392
+ // Remove the blinking cursor element from the string if present
393
+ const cleanedContent = currentContent.replace(`<span class="blinking-cursor"></span>`, '');
394
+ updateLastMessage(cleanedContent, false);
395
+
396
+ // Revert the button back to the send icon
397
+ sendBtn.innerHTML = `<img src="assets/send.svg" alt="Send Icon" class="svg-icon-non-white">`;
398
+ isStreaming = false;
399
+ return;
400
+ }
401
+
402
+ // Otherwise, start sending the message
403
+ const text = chatInput.value;
404
+ if (text.trim() !== '') {
405
+ await addConversation(text);
406
+ chatInput.value = '';
407
+ resetTextarea();
408
+ }
409
+ });
410
+
411
+
412
+ chatInput.addEventListener('keydown', function (e) {
413
+ if (e.key === 'Enter' && (e.ctrlKey || e.metaKey)) {
414
+ e.preventDefault();
415
+ sendBtn.click();
416
+ }
417
+ });
418
+ const prevBtn = document.getElementById('prevBtn');
419
+ const nextBtn = document.getElementById('nextBtn');
420
+ prevBtn.addEventListener('click', () => {
421
+ if (currentCardIndex > 0) {
422
+ currentCardIndex--;
423
+ updateCarousel();
424
+ }
425
+ });
426
+ nextBtn.addEventListener('click', () => {
427
+ const cards = document.querySelectorAll('.card');
428
+ if (currentCardIndex < cards.length - 1) {
429
+ currentCardIndex++;
430
+ updateCarousel();
431
+ }
432
+ });
433
+ // ----------------- File Attachment Handling -----------------
434
+ const attachBtn = document.getElementById('attachBtn');
435
+ const fileInput = document.getElementById('fileInput');
436
+ const fileAttachments = document.getElementById('fileAttachments');
437
+ attachBtn.addEventListener('click', () => {
438
+ fileInput.click();
439
+ });
440
+ fileInput.addEventListener('change', () => {
441
+ for (const file of fileInput.files) {
442
+ attachedFiles.push(file);
443
+ // Display the file path in the console
444
+ }
445
+ fileInput.value = "";
446
+ updateFileAttachments();
447
+ });
448
+
449
+ function updateFileAttachments() {
450
+ fileAttachments.innerHTML = "";
451
+ attachedFiles.forEach((file, index) => {
452
+ const fileDiv = document.createElement("div");
453
+ fileDiv.className = "file-item";
454
+ fileDiv.innerHTML = `<span>${file.name}</span> <button data-index="${index}">&times;</button>`;
455
+ fileAttachments.appendChild(fileDiv);
456
+ });
457
+ document.querySelectorAll(".file-item button").forEach(btn => {
458
+ btn.addEventListener("click", (e) => {
459
+ const idx = e.target.getAttribute("data-index");
460
+ attachedFiles.splice(idx, 1);
461
+ updateFileAttachments();
462
+ });
463
+ });
464
+ }
465
+ // ----------------- Summary Overlay -----------------
466
+ const summaryOverlay = document.getElementById('summaryOverlay');
467
+ const closeSummaryBtn = document.getElementById('closeSummaryBtn');
468
+ const summaryContent = document.getElementById('summaryContent');
469
+ const downloadSummaryBtn = document.getElementById('downloadSummary');
470
+ closeSummaryBtn.addEventListener('click', () => {
471
+ summaryOverlay.classList.remove('active');
472
+ });
473
+ downloadSummaryBtn.addEventListener('click', () => {
474
+ const blob = new Blob([sessions[currentSessionIndex].summary], { type: "text/markdown" });
475
+ const url = URL.createObjectURL(blob);
476
+ const a = document.createElement("a");
477
+ a.href = url;
478
+ a.download = "summary.md";
479
+ a.click();
480
+ URL.revokeObjectURL(url);
481
+ });
482
+ // ----------------- Settings Overlay -----------------
483
+ const settingsOverlay = document.getElementById('settingsOverlay');
484
+ const closeSettingsBtn = document.getElementById('closeSettingsBtn');
485
+ const temperatureInput = document.getElementById('temperature');
486
+ const temperatureValue = document.getElementById('temperatureValue');
487
+ const maxTokensInput = document.getElementById('maxTokens');
488
+ const personaSelect = document.getElementById('persona');
489
+ const saveSettingsBtn = document.getElementById('saveSettings');
490
+ closeSettingsBtn.addEventListener('click', () => {
491
+ settingsOverlay.classList.remove('active');
492
+ });
493
+ temperatureInput.addEventListener('input', () => {
494
+ temperatureValue.textContent = temperatureInput.value;
495
+ });
496
+ saveSettingsBtn.addEventListener('click', () => {
497
+ const sessionSettings = sessions[currentSessionIndex].settings;
498
+ sessionSettings.temperature = parseFloat(temperatureInput.value);
499
+ sessionSettings.maxTokens = parseInt(maxTokensInput.value);
500
+ sessionSettings.persona = personaSelect.value;
501
+ sessionSettings.model = modelSelect.value;
502
+ // Read the summarization toggle value
503
+ sessionSettings.enableSummarization = document.getElementById('toggleSummarization').checked;
504
+
505
+ console.log('Session settings saved:', sessions[currentSessionIndex].settings);
506
+ settingsOverlay.classList.remove('active');
507
+ });
508
+
509
+ // ----------------- Title Editing -----------------
510
+ const editTitleBtn = document.getElementById('editTitleBtn');
511
+ editTitleBtn.addEventListener('click', () => {
512
+ const currentTitle = sessions[currentSessionIndex].title;
513
+ const newTitle = prompt("Enter new chat title:", currentTitle);
514
+ if (newTitle !== null && newTitle.trim() !== "") {
515
+ sessions[currentSessionIndex].title = newTitle.trim();
516
+ document.getElementById('chatTitle').textContent = newTitle.trim();
517
+ }
518
+ });
519
+ // ----------------- Auto-Dismiss Overlays -----------------
520
+ document.addEventListener('click', (e) => {
521
+ if (summaryOverlay.classList.contains('active') && !summaryOverlay.contains(e.target)) {
522
+ summaryOverlay.classList.remove('active');
523
+ }
524
+ if (settingsOverlay.classList.contains('active') && !settingsOverlay.contains(e.target)) {
525
+ settingsOverlay.classList.remove('active');
526
+ }
527
+ });
528
+ // ----------------- Global Keyboard Navigation -----------------
529
+ document.addEventListener('keydown', (e) => {
530
+ if (document.activeElement !== chatInput) {
531
+ if (e.key === 'ArrowLeft' && currentCardIndex > 0) {
532
+ currentCardIndex--;
533
+ updateCarousel();
534
+ } else if (e.key === 'ArrowRight') {
535
+ const cards = document.querySelectorAll('.card');
536
+ if (currentCardIndex < cards.length - 1) {
537
+ currentCardIndex++;
538
+ updateCarousel();
539
+ }
540
+ }
541
+ }
542
+ });
543
+ // ----------------- Hamburger Toggle -----------------
544
+ const hamburgerBtn = document.getElementById('hamburgerBtn');
545
+ const navBar = document.getElementById('navBar');
546
+ hamburgerBtn.addEventListener('click', (e) => {
547
+ e.stopPropagation();
548
+ navBar.classList.toggle('hidden');
549
+ updateHamburgerPosition();
550
+ });
551
+
552
+ // ----------------- Custom Button Event Listeners -----------------
553
+ const customBtn1 = document.getElementById('customBtn1');
554
+ const customBtn2 = document.getElementById('customBtn2');
555
+
556
+ customBtn1.addEventListener('click', (e) => {
557
+ e.stopPropagation();
558
+ // Open the summary overlay for the current session
559
+ document.getElementById('summaryContent').innerHTML = marked.parse(sessions[currentSessionIndex].summary);
560
+ summaryOverlay.classList.add('active');
561
+ settingsOverlay.classList.remove('active');
562
+ });
563
+
564
+ customBtn2.addEventListener('click', (e) => {
565
+ e.stopPropagation();
566
+ // Open the settings overlay for the current session and fill in the fields
567
+ const settings = sessions[currentSessionIndex].settings;
568
+ temperatureInput.value = settings.temperature;
569
+ temperatureValue.textContent = settings.temperature;
570
+ maxTokensInput.value = settings.maxTokens;
571
+ personaSelect.value = settings.persona;
572
+ settingsOverlay.classList.add('active');
573
+ summaryOverlay.classList.remove('active');
574
+ });
575
+
576
+ // Get reference to the new select element
577
+ const modelSelect = document.getElementById('modelSelect');
578
+
579
+ function openSettingsForCurrentSession() {
580
+ const settings = sessions[currentSessionIndex].settings;
581
+ // Existing lines for temperature, maxTokens, persona...
582
+ modelSelect.value = settings.model; // Populate the dropdown with the current model
583
+ }
584
+
585
+ // When opening the settings overlay:
586
+ customBtn2.addEventListener('click', (e) => {
587
+ e.stopPropagation();
588
+ // ...
589
+ openSettingsForCurrentSession(); // load session settings into the UI
590
+ settingsOverlay.classList.add('active');
591
+ summaryOverlay.classList.remove('active');
592
+ });
593
+
594
+ // Saving:
595
+ saveSettingsBtn.addEventListener('click', () => {
596
+ const sessionSettings = sessions[currentSessionIndex].settings;
597
+ // Existing lines for temperature, maxTokens, persona...
598
+ sessionSettings.model = modelSelect.value; // Save the selected model
599
+
600
+ console.log('Session settings saved:', sessions[currentSessionIndex].settings);
601
+ settingsOverlay.classList.remove('active');
602
+ });
603
+
604
+ async function callLLMStream(conversation, signal) {
605
+ const session = sessions[currentSessionIndex];
606
+ const { model, temperature, maxTokens } = session.settings;
607
+
608
+ if (model.startsWith("gpt-4o")) {
609
+ // Call OpenAI endpoint
610
+ return callOpenAIStream(session.id, conversation, signal);
611
+ } else if (model.startsWith("claude")) {
612
+ // Call Anthropic endpoint
613
+ return callAnthropicStream(session.id, conversation, signal);
614
+ } else if (model.startsWith("gemini")) {
615
+ // Call Google endpoint
616
+ return callGoogleStream(session.id, conversation, signal);
617
+ } else if (model.startsWith("huggingface")) {
618
+ // Call Hugging Face endpoint
619
+ return callHuggingFaceStream(session.id, conversation, model.replace("huggingface/", ""), signal);
620
+ } else if (model.startsWith("mistral")) {
621
+ // Call Mistral endpoint
622
+ return callMistralStream(session.id, conversation, model, signal);
623
+ } else {
624
+ throw new Error("Unsupported model: " + model);
625
+ }
626
+ }
627
+
628
+ async function callLLMSummaryBatch(sessionId, conversation, model, temperature, maxTokens) {
629
+ const loadingOverlay = document.getElementById("loadingOverlay");
630
+ loadingOverlay.classList.add("active");
631
+
632
+ let endpoint = "";
633
+ if (model.startsWith("gpt-4o")) {
634
+ endpoint = "http://127.0.0.1:8000/openai_summary";
635
+ } else if (model.startsWith("claude")) {
636
+ endpoint = "http://127.0.0.1:8000/anthropic_summary";
637
+ } else if (model.startsWith("gemini")) {
638
+ endpoint = "http://127.0.0.1:8000/gemini_summary";
639
+ } else if (model.startsWith("huggingface")) {
640
+ endpoint = "http://127.0.0.1:8000/huggingface_summary";
641
+ } else if (model.startsWith("mistral")) {
642
+ endpoint = "http://127.0.0.1:8000/mistral_summary";
643
+ } else {
644
+ throw new Error("Unsupported model for summary: " + model);
645
+ }
646
+
647
+ const response = await fetch(endpoint, {
648
+ method: "POST",
649
+ headers: {
650
+ "Content-Type": "application/json",
651
+ "X-Session-ID": sessionId
652
+ },
653
+ body: JSON.stringify({
654
+ conversation: conversation,
655
+ temperature: temperature,
656
+ max_tokens: maxTokens,
657
+ model: model,
658
+ })
659
+ });
660
+
661
+ // Since the summary endpoint returns batch data,
662
+ // wait for the full response text.
663
+ const responseData = await response.json();
664
+ const summaryText = responseData.summary;
665
+ sessions[currentSessionIndex].summary = summaryText;
666
+
667
+ // Optionally update the summary overlay if it is open
668
+ const summaryOverlay = document.getElementById('summaryOverlay');
669
+ if (summaryOverlay.classList.contains('active')) {
670
+ document.getElementById('summaryContent').innerHTML = marked.parse(summaryText);
671
+ }
672
+
673
+ loadingOverlay.classList.remove("active");
674
+
675
+ return summaryText;
676
+ }
677
+
678
+
679
+ async function callMistralStream(sessionId, conversation, model, signal) {
680
+ console.log(`Calling Mistral API with model: ${model}`);
681
+ const response = await fetch("http://127.0.0.1:8000/mistral_stream", {
682
+ method: "POST",
683
+ headers: {
684
+ "Content-Type": "application/json",
685
+ "X-Session-ID": sessionId
686
+ },
687
+ body: JSON.stringify({
688
+ conversation: conversation,
689
+ temperature: sessions[currentSessionIndex].settings.temperature,
690
+ max_tokens: sessions[currentSessionIndex].settings.maxTokens,
691
+ model: sessions[currentSessionIndex].settings.model,
692
+ }),
693
+ signal: signal
694
+ });
695
+
696
+ const reader = response.body.getReader();
697
+ const decoder = new TextDecoder("utf-8");
698
+ let done = false;
699
+ let aiMessage = "";
700
+
701
+ updateLastMessage(aiMessage, true);
702
+ while (!done) {
703
+ const { value, done: doneReading } = await reader.read();
704
+ done = doneReading;
705
+ const chunk = decoder.decode(value);
706
+ const lines = chunk.split("\n").filter(line => line.trim().startsWith("data:"));
707
+ for (const line of lines) {
708
+ const dataStr = line.replace(/^data:\s*/, "");
709
+ if (dataStr === "[DONE]") {
710
+ done = true;
711
+ break;
712
+ }
713
+ try {
714
+ const parsed = JSON.parse(dataStr);
715
+ const delta = parsed.choices[0].delta.content;
716
+ if (delta) {
717
+ aiMessage += delta;
718
+ updateLastMessage(aiMessage, true);
719
+ }
720
+ } catch (err) {
721
+ console.error("Mistral stream parsing error:", err);
722
+ }
723
+ }
724
+ }
725
+ updateLastMessage(aiMessage, false);
726
+ return aiMessage;
727
+ }
728
+
729
+
730
+ async function callOpenAIStream(sessionId, conversation, signal) {
731
+ const response = await fetch("http://127.0.0.1:8000/openai_stream", {
732
+ method: "POST",
733
+ headers: {
734
+ "Content-Type": "application/json",
735
+ "X-Session-ID": sessionId
736
+ // Remove the Authorization header since the Python backend handles the API key.
737
+ },
738
+ body: JSON.stringify({
739
+ conversation: conversation,
740
+ temperature: sessions[currentSessionIndex].settings.temperature,
741
+ max_tokens: sessions[currentSessionIndex].settings.maxTokens,
742
+ model: sessions[currentSessionIndex].settings.model,
743
+ }),
744
+ signal: signal
745
+ });
746
+ const reader = response.body.getReader();
747
+ const decoder = new TextDecoder("utf-8");
748
+ let done = false;
749
+ let aiMessage = "";
750
+
751
+ updateLastMessage(aiMessage, true);
752
+ while (!done) {
753
+ const { value, done: doneReading } = await reader.read();
754
+ done = doneReading;
755
+ const chunk = decoder.decode(value);
756
+ const lines = chunk.split("\n").filter(line => line.trim().startsWith("data:"));
757
+ for (const line of lines) {
758
+ const dataStr = line.replace(/^data:\s*/, "");
759
+ if (dataStr === "[DONE]") {
760
+ done = true;
761
+ break;
762
+ }
763
+ try {
764
+ // Parse the JSON returned by the Python backend.
765
+ const parsed = JSON.parse(dataStr);
766
+ // Assuming the payload structure is the same as OpenAI's response.
767
+ const delta = parsed.choices[0].delta.content;
768
+ if (delta) {
769
+ aiMessage += delta;
770
+ updateLastMessage(aiMessage, true);
771
+ }
772
+ } catch (err) {
773
+ console.error("Stream parsing error:", err);
774
+ }
775
+ }
776
+ }
777
+ updateLastMessage(aiMessage, false);
778
+ return aiMessage;
779
+ }
780
+
781
+
782
+ async function callAnthropicStream(sessionId, conversation, signal) {
783
+ const model = sessions[currentSessionIndex].settings.model.toLowerCase().replace(/\s+/g, '-').replace(/\./g, '-');
784
+ console.log(`Calling Anthropic API with model: ${model}`);
785
+
786
+ const response = await fetch("http://127.0.0.1:8000/anthropic_stream", {
787
+ method: "POST",
788
+ headers: {
789
+ "Content-Type": "application/json",
790
+ "X-Session-ID": sessionId
791
+ },
792
+ body: JSON.stringify({
793
+ messages: conversation,
794
+ temperature: sessions[currentSessionIndex].settings.temperature,
795
+ max_tokens: sessions[currentSessionIndex].settings.maxTokens,
796
+ model: model + "-latest", // uses the normalized model name derived above
797
+ }),
798
+ signal: signal
799
+ });
800
+
801
+ const reader = response.body.getReader();
802
+ const decoder = new TextDecoder("utf-8");
803
+ let done = false;
804
+ let aiMessage = "";
805
+
806
+ updateLastMessage(aiMessage, true);
807
+ while (!done) {
808
+ const { value, done: doneReading } = await reader.read();
809
+ done = doneReading;
810
+ const chunk = decoder.decode(value);
811
+ const lines = chunk.split("\n").filter(line => line.trim().startsWith("data:"));
812
+
813
+ for (const line of lines) {
814
+ const dataStr = line.replace(/^data:\s*/, "");
815
+ if (dataStr === "[DONE]") {
816
+ done = true;
817
+ break;
818
+ }
819
+
820
+ try {
821
+ const parsed = JSON.parse(dataStr);
822
+ const delta = parsed.choices[0].delta.content;
823
+ if (delta) {
824
+ aiMessage += delta;
825
+ updateLastMessage(aiMessage, true);
826
+ }
827
+ } catch (err) {
828
+ console.error("Anthropic stream parsing error:", err);
829
+ }
830
+ }
831
+ }
832
+ updateLastMessage(aiMessage, false);
833
+ return aiMessage;
834
+
835
+ }
836
+
837
+ async function callGoogleStream(sessionId, conversation, signal) {
838
+ const response = await fetch("http://127.0.0.1:8000/gemini_stream", {
839
+ method: "POST",
840
+ headers: {
841
+ "Content-Type": "application/json",
842
+ "X-Session-ID": sessionId
843
+ },
844
+ body: JSON.stringify({
845
+ messages: conversation,
846
+ temperature: sessions[currentSessionIndex].settings.temperature,
847
+ max_tokens: sessions[currentSessionIndex].settings.maxTokens,
848
+ model: sessions[currentSessionIndex].settings.model.toLowerCase().replace(/\s+/g, '-')
849
+ }),
850
+ signal: signal
851
+ });
852
+
853
+ const reader = response.body.getReader();
854
+ const decoder = new TextDecoder("utf-8");
855
+ let done = false;
856
+ let aiMessage = "";
857
+
858
+ updateLastMessage(aiMessage, true);
859
+ while (!done) {
860
+ const { value, done: doneReading } = await reader.read();
861
+ done = doneReading;
862
+ const chunk = decoder.decode(value);
863
+ const lines = chunk.split("\n").filter(line => line.trim().startsWith("data:"));
864
+
865
+ for (const line of lines) {
866
+ const dataStr = line.replace(/^data:\s*/, "");
867
+ if (dataStr === "[DONE]") {
868
+ done = true;
869
+ break;
870
+ }
871
+
872
+ try {
873
+ const parsed = JSON.parse(dataStr);
874
+ const delta = parsed.choices[0].delta.content;
875
+ if (delta) {
876
+ aiMessage += delta;
877
+ updateLastMessage(aiMessage, true);
878
+ }
879
+ } catch (err) {
880
+ console.error("Gemini stream parsing error:", err);
881
+ }
882
+ }
883
+ }
884
+ updateLastMessage(aiMessage, false);
885
+ return aiMessage;
886
+ }
887
+
888
+ async function callHuggingFaceStream(sessionId, conversation, model, signal) {
889
+ console.log(`Calling Hugging Face API with model: ${model}`);
890
+ const response = await fetch("http://127.0.0.1:8000/huggingface_stream", {
891
+ method: "POST",
892
+ headers: {
893
+ "Content-Type": "application/json",
894
+ "X-Session-ID": sessionId
895
+ },
896
+ body: JSON.stringify({
897
+ messages: conversation,
898
+ temperature: sessions[currentSessionIndex].settings.temperature,
899
+ max_tokens: sessions[currentSessionIndex].settings.maxTokens,
900
+ model: model, // repo id passed in by callLLMStream (the "huggingface/" prefix is already stripped)
901
+ }),
902
+ signal: signal
903
+ });
904
+
905
+ const reader = response.body.getReader();
906
+ const decoder = new TextDecoder("utf-8");
907
+ let done = false;
908
+ let aiMessage = "";
909
+
910
+ updateLastMessage(aiMessage, true);
911
+ while (!done) {
912
+ const { value, done: doneReading } = await reader.read();
913
+ done = doneReading;
914
+ const chunk = decoder.decode(value);
915
+ const lines = chunk.split("\n").filter(line => line.trim().startsWith("data:"));
916
+
917
+ for (const line of lines) {
918
+ const dataStr = line.replace(/^data:\s*/, "");
919
+ if (dataStr === "[DONE]") {
920
+ done = true;
921
+ break;
922
+ }
923
+
924
+ try {
925
+ const parsed = JSON.parse(dataStr);
926
+ const delta = parsed.choices[0].delta.content;
927
+ if (delta) {
928
+ aiMessage += delta;
929
+ updateLastMessage(aiMessage, true);
930
+ }
931
+ } catch (err) {
932
+ console.error("Hugging Face stream parsing error:", err);
933
+ }
934
+ }
935
+ }
936
+ updateLastMessage(aiMessage, false);
937
+ return aiMessage;
938
+ }
939
+
940
+ // ----------------- Initialization -----------------
941
+ initSessions();
942
+ updateHamburgerPosition();
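
Note: every provider-specific `call*Stream` helper above assumes the backend relays OpenAI-style server-sent events, i.e. `data:` lines carrying `{"choices":[{"delta":{"content":"…"}}]}` and a final `data: [DONE]`. Because the parsing loops are identical, they could share a single reader. The sketch below is illustrative only (the `readSSE` name is not part of the codebase); it additionally buffers a partial trailing line, which the current per-function loops do not guard against when a JSON payload is split across network chunks.

```js
// Illustrative only: a shared SSE reader matching the format the call*Stream
// helpers above already expect from the backend.
async function readSSE(response, onDelta) {
  const reader = response.body.getReader();
  const decoder = new TextDecoder("utf-8");
  let buffer = "";
  let text = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.trim().startsWith("data:")) continue;
      const dataStr = line.replace(/^data:\s*/, "");
      if (dataStr === "[DONE]") return text;
      try {
        const delta = JSON.parse(dataStr).choices[0].delta.content;
        if (delta) {
          text += delta;
          onDelta(text); // e.g. updateLastMessage(text, true)
        }
      } catch (err) {
        console.error("SSE parsing error:", err);
      }
    }
  }
  return text;
}
```

The `signal` argument threaded through these functions is presumably produced by an `AbortController` in the send handler, so that a Stop button can call `controller.abort()` to cancel the in-flight fetch.
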
standalone/front/scripts/sessions.js ADDED
@@ -0,0 +1,227 @@
1
+ // sessions.js
2
+ import { formatTimestamp } from './utils.js';
3
+ import { updateLayout } from './navigation.js';
4
+
5
+ export let sessions = [];
6
+ export let currentSessionIndex = 0;
7
+ export let currentCardIndex = 0;
8
+
9
+ const summarizeToggleBtn = document.getElementById('customBtn3');
10
+
11
+ export function initSessions() {
12
+ sessions.push({
13
+ id: Date.now() + '-' + Math.random().toString(36).substr(2, 9),
14
+ name: "Chat Session 1",
15
+ title: "Chat Session 1",
16
+ messages: [],
17
+ summary: "# Chat Summary\n\nThis is the default summary for Chat Session 1.",
18
+ settings: {
19
+ temperature: 0.7,
20
+ maxTokens: 8096,
21
+ persona: "professional",
22
+ model: "gpt-4o-mini",
23
+ enableSummarization: false,
24
+ },
25
+ });
26
+ currentSessionIndex = 0;
27
+ currentCardIndex = 0;
28
+ renderSessionList();
29
+ renderCurrentSession();
30
+
31
+ summarizeToggleBtn.classList.remove('active');
32
+ }
33
+
34
+ export function renderSessionList() {
35
+ const sessionList = document.getElementById('sessionList');
36
+ sessionList.innerHTML = "";
37
+ sessions.forEach((session, index) => {
38
+ const li = document.createElement('li');
39
+ const nameSpan = document.createElement('span');
40
+ nameSpan.textContent = session.name;
41
+ li.appendChild(nameSpan);
42
+ const removeBtn = document.createElement('button');
43
+ removeBtn.textContent = "𝘅";
44
+ removeBtn.className = "remove-session";
45
+ removeBtn.addEventListener('click', (e) => {
46
+ e.stopPropagation();
47
+ removeSession(index);
48
+ });
49
+ li.appendChild(removeBtn);
50
+ li.addEventListener('click', () => {
51
+ currentSessionIndex = index;
52
+ currentCardIndex = 0;
53
+ renderSessionList();
54
+ renderCurrentSession();
55
+ });
56
+ if (index === currentSessionIndex) li.classList.add('active');
57
+ sessionList.appendChild(li);
58
+ });
59
+ }
60
+
61
+ export function removeSession(index) {
62
+ sessions.splice(index, 1);
63
+ if (sessions.length === 0) {
64
+ initSessions();
65
+ } else {
66
+ if (currentSessionIndex >= sessions.length) {
67
+ currentSessionIndex = sessions.length - 1;
68
+ }
69
+ currentCardIndex = 0;
70
+ }
71
+ renderSessionList();
72
+ renderCurrentSession();
73
+ }
74
+
75
+ export function renderCurrentSession() {
76
+ const session = sessions[currentSessionIndex];
77
+ const carousel = document.getElementById('carousel');
78
+ carousel.innerHTML = "";
79
+ session.messages.forEach(message => {
80
+ const card = document.createElement('div');
81
+ card.className = 'card';
82
+ let attachmentHTML = "";
83
+ if (message.attachments && message.attachments.length > 0) {
84
+ attachmentHTML = `
85
+ <div class="vertical-file-list">
86
+ ${message.attachments.map(file => `<div class="file-item-vertical">${file.path}</div>`).join("")}
87
+ </div>
88
+ `;
89
+ }
90
+ card.innerHTML = `
91
+ <div class="conversation">
92
+ <div class="message user">
93
+ ${attachmentHTML}
94
+ <div class="message-text markdown-body">${marked.parse(message.userText)}</div>
95
+ </div>
96
+ <div class="message ai">
97
+ <div class="message-text markdown-body">${marked.parse(message.aiResponse)}</div>
98
+ <div class="ai-meta">
99
+ <span class="ai-model">${message.model}</span>
100
+ <span class="ai-timestamp"> @${formatTimestamp(message.timestamp)}</span>
101
+ </div>
102
+ </div>
103
+ </div>
104
+ `;
105
+ carousel.appendChild(card);
106
+ processMessagesInContainer(card);
107
+ });
108
+ currentCardIndex = session.messages.length > 0 ? session.messages.length - 1 : 0;
109
+ updateCarousel();
110
+ updateLayout();
111
+ document.getElementById('chatTitle').textContent = session.title;
112
+
113
+ // Re-highlight code blocks and add copy buttons.
114
+ document.querySelectorAll('.markdown-body pre code').forEach((block) => {
115
+ hljs.highlightElement(block);
116
+ });
117
+ addCopyButtons();
118
+
119
+ summarizeToggleBtn.classList.toggle('active', session.settings.enableSummarization);
120
+ }
121
+
122
+ export function updateCarousel() {
123
+ const carousel = document.getElementById('carousel');
124
+ if (!window.isTraditionalLayout) {
125
+ const cards = document.querySelectorAll('.card');
126
+ carousel.style.transform = `translateX(-${currentCardIndex * 100}%)`;
127
+ }
128
+ updateTurnLabel(sessionMessagesCount());
129
+ }
130
+
131
+ export function getCurrentCardIndex() {
132
+ return currentCardIndex;
133
+ }
134
+
135
+ export function setCurrentCardIndex(newIndex) {
136
+ currentCardIndex = newIndex;
137
+ updateCarousel();
138
+ }
139
+
140
+ export function sessionMessagesCount() {
141
+ return sessions[currentSessionIndex].messages.length;
142
+ }
143
+
144
+ export function updateTurnLabel(totalCards) {
145
+ const turnLabel = document.getElementById('turnLabel');
146
+ if (window.isTraditionalLayout) {
147
+ turnLabel.textContent = `Turn: ${totalCards} / ${totalCards}`;
148
+ } else {
149
+ turnLabel.textContent = totalCards ? `Turn: ${currentCardIndex + 1} / ${totalCards}` : "Turn: 0 / 0";
150
+ }
151
+ }
152
+
153
+ export function processMessage(messageEl) {
154
+ if (messageEl.dataset.processed) return;
155
+ const textEl = messageEl.querySelector('.message-text');
156
+ if (messageEl.classList.contains('user') && textEl.scrollHeight > 80) {
157
+ const toggleBtn = document.createElement('button');
158
+ toggleBtn.className = 'toggle-btn';
159
+ toggleBtn.textContent = 'Read more';
160
+ toggleBtn.addEventListener('click', function () {
161
+ if (messageEl.classList.contains('expanded')) {
162
+ messageEl.classList.remove('expanded');
163
+ toggleBtn.textContent = 'Read more';
164
+ } else {
165
+ messageEl.classList.add('expanded');
166
+ toggleBtn.textContent = 'Read less';
167
+ }
168
+ });
169
+ messageEl.appendChild(toggleBtn);
170
+ }
171
+ messageEl.dataset.processed = 'true';
172
+ }
173
+
174
+ export function processMessagesInContainer(container) {
175
+ container.querySelectorAll('.message').forEach(processMessage);
176
+ }
177
+
178
+ /**
179
+ * Adds copy buttons to code blocks in rendered markdown.
180
+ */
181
+ export function addCopyButtons() {
182
+ document.querySelectorAll('.markdown-body pre').forEach((pre) => {
183
+ if (pre.querySelector('.copy-button')) return;
184
+ pre.style.position = "relative";
185
+ const button = document.createElement("button");
186
+ button.className = "copy-button";
187
+ button.innerText = "Copy";
188
+ pre.appendChild(button);
189
+ button.addEventListener("click", () => {
190
+ const codeText = pre.querySelector("code").innerText;
191
+ navigator.clipboard.writeText(codeText).then(() => {
192
+ button.innerText = "Copied!";
193
+ setTimeout(() => {
194
+ button.innerText = "Copy";
195
+ }, 2000);
196
+ }).catch((err) => {
197
+ console.error("Failed to copy code: ", err);
198
+ });
199
+ });
200
+ });
201
+ }
202
+
203
+ document.getElementById('newSessionBtn').addEventListener('click', () => {
204
+ const newSession = {
205
+ id: Date.now() + '-' + Math.random().toString(36).substr(2, 9),
206
+ name: "Chat Session " + (sessions.length + 1),
207
+ title: "Chat Session " + (sessions.length + 1),
208
+ messages: [],
209
+ summary: "# Chat Summary\n\nThis is the default summary for Chat Session " + (sessions.length + 1) + ".",
210
+ settings: {
211
+ temperature: 0.7,
212
+ maxTokens: 8096,
213
+ persona: "professional",
214
+ model: "gpt-4o-mini",
215
+ enableSummarization: false
216
+ }
217
+ };
218
+
219
+ sessions.push(newSession);
220
+ currentSessionIndex = sessions.length - 1;
221
+ currentCardIndex = 0;
222
+
223
+ summarizeToggleBtn.classList.remove('active');
224
+
225
+ renderSessionList();
226
+ renderCurrentSession();
227
+ });
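
For reference, `renderCurrentSession()` above reads a fixed message shape from `session.messages`: the user text, the (possibly still streaming) AI response, optional attachments with a `path`, the model name shown in the `.ai-meta` footer, and an ISO timestamp. A hedged sketch of appending a new turn before streaming begins; the `appendTurn` helper is hypothetical, the real send handler lives elsewhere in the front-end.

```js
// Illustrative only: the message shape renderCurrentSession() expects.
function appendTurn(session, userText, attachments = []) {
  session.messages.push({
    userText,                            // rendered with marked.parse()
    aiResponse: "",                      // filled in incrementally while streaming
    attachments,                         // e.g. [{ path: "report.pdf" }]
    model: session.settings.model,       // shown in the .ai-meta footer
    timestamp: new Date().toISOString(), // formatted by formatTimestamp()
  });
  renderCurrentSession();
}
```
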
standalone/front/scripts/utils.js ADDED
@@ -0,0 +1,56 @@
1
+ // utils.js
2
+ import { renderCurrentSession } from './sessions.js'; // needed by updateLastMessage() below
3
+ // Set marked options (for syntax highlighting)
4
+ marked.setOptions({
5
+ highlight: function (code, lang) {
6
+ if (lang && hljs.getLanguage(lang)) {
7
+ return hljs.highlight(code, { language: lang }).value;
8
+ }
9
+ return hljs.highlightAuto(code).value;
10
+ },
11
+ });
12
+
13
+ /**
14
+ * Format an ISO timestamp to a friendly date + time string.
15
+ */
16
+ export function formatTimestamp(isoString) {
17
+ const date = new Date(isoString);
18
+ return date.toLocaleString([], {
19
+ year: 'numeric',
20
+ month: 'short',
21
+ day: 'numeric',
22
+ hour: '2-digit',
23
+ minute: '2-digit',
24
+ });
25
+ }
26
+
27
+ /**
28
+ * Updates the last message in the current session.
29
+ * If isStreaming is true, a blinking cursor is appended.
30
+ */
31
+ export function updateLastMessage(session, content, isStreaming = false) {
32
+ // Update the last message in the specified session.
33
+ session.messages[session.messages.length - 1].aiResponse =
34
+ isStreaming ? content + `<span class="blinking-cursor"></span>` : content;
35
+
36
+ renderCurrentSession();
37
+
38
+ // Wait until the DOM updates, then re-highlight code blocks and adjust scroll.
39
+ requestAnimationFrame(() => {
40
+ document.querySelectorAll('pre code').forEach((block) => {
41
+ hljs.highlightElement(block);
42
+ });
43
+ const lastCard = document.querySelector('.card:last-child');
44
+ if (lastCard) {
45
+ lastCard.scrollTop = isStreaming ? lastCard.scrollHeight : lastCard.scrollTop;
46
+ }
47
+ });
48
+ }
49
+
50
+ /**
51
+ * Auto-resize a textarea based on its content.
52
+ */
53
+ export function autoResizeTextarea(textarea) {
54
+ textarea.style.height = 'auto';
55
+ textarea.style.height = textarea.scrollHeight + 'px';
56
+ }
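
A brief usage sketch of the helpers above (illustrative only; the element id matches `index.html`, the wiring itself is not part of this file):

```js
// Illustrative only: typical call sites for the utils.js helpers.
const chatInput = document.getElementById("chatInput");
chatInput.addEventListener("input", () => autoResizeTextarea(chatInput));

// While a reply streams in, updateLastMessage keeps the blinking cursor visible:
//   updateLastMessage(session, partialText, true);   // still streaming
//   updateLastMessage(session, finalText, false);    // finished
```
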
standalone/front/styles/style.css ADDED
@@ -0,0 +1,995 @@
1
+ :root {
2
+ --primary-color: #4a90e2;
3
+ --primary-dark: #4178c0;
4
+ --accent-color: #50e3c2;
5
+ --background-color: #f9f9f9;
6
+ --body-bg: #f0f2f5;
7
+ --text-color: #333;
8
+ --light-shadow: rgba(0, 0, 0, 0.1);
9
+ --nav-width: 320px;
10
+ --border-radius: 12px;
11
+ }
12
+ /* Global Styles */
13
+ * {
14
+ box-sizing: border-box;
15
+ margin: 0;
16
+ padding: 0;
17
+ }
18
+ body {
19
+ font-family: 'Poppins', sans-serif;
20
+ background: var(--body-bg);
21
+ overflow: hidden;
22
+ color: var(--text-color);
23
+ }
24
+ /* Hamburger Button */
25
+ #hamburgerBtn {
26
+ background: var(--primary-color);
27
+ color: #fff;
28
+ border: none;
29
+ padding: 5px 10px;
30
+ font-size: 1.5em;
31
+ border-radius: 6px;
32
+ cursor: pointer;
33
+ transition: background 0.3s;
34
+ }
35
+ #hamburgerBtn:hover {
36
+ background: var(--primary-dark);
37
+ }
38
+ .chat-header .chat-hamburger {
39
+ margin-top: 0;
40
+ margin-right: 10px;
41
+ }
42
+ /* Navigation Header (New placement for New Chat button) */
43
+ .nav-header {
44
+ display: flex;
45
+ align-items: center;
46
+ gap: 10px;
47
+ margin-bottom: 20px;
48
+ }
49
+ /* App Container */
50
+ .app-container {
51
+ display: flex;
52
+ height: 100vh;
53
+ width: 100vw;
54
+ transition: all 0.3s ease;
55
+ }
56
+ /* Left Navigation Bar */
57
+ .nav-bar {
58
+ width: var(--nav-width);
59
+ background: #fff;
60
+ padding: 24px;
61
+ box-shadow: 2px 0 16px var(--light-shadow);
62
+ overflow-y: auto;
63
+ transition: transform 0.3s ease, width 0.3s ease, padding 0.3s ease;
64
+ transform: translateX(0);
65
+ display: flex;
66
+ flex-direction: column;
67
+ }
68
+ .nav-bar.hidden {
69
+ transform: translateX(-100%);
70
+ width: 0;
71
+ padding: 0;
72
+ }
73
+ .nav-bar h3 {
74
+ font-size: 1.5em;
75
+ margin-bottom: 20px;
76
+ color: var(--primary-color);
77
+ }
78
+ .nav-bar ul {
79
+ list-style: none;
80
+ flex: 1;
81
+ }
82
+ .nav-bar li {
83
+ padding: 12px 16px;
84
+ margin-bottom: 12px;
85
+ background: var(--background-color);
86
+ border-radius: var(--border-radius);
87
+ cursor: pointer;
88
+ display: flex;
89
+ align-items: center;
90
+ justify-content: space-between;
91
+ transition: background 0.3s;
92
+ font-size: 1em;
93
+ }
94
+ .nav-bar li.active,
95
+ .nav-bar li:hover {
96
+ background: var(--primary-color);
97
+ color: #fff;
98
+ }
99
+ .session-actions {
100
+ display: flex;
101
+ }
102
+ .session-summary-btn,
103
+ .session-settings-btn {
104
+ background: rgba(74, 144, 226, 0.15);
105
+ border: none;
106
+ border-radius: 50%;
107
+ cursor: pointer;
108
+ display: flex;
109
+ align-items: center;
110
+ justify-content: center;
111
+ transition: transform 0.2s, background 0.3s;
112
+ color: inherit;
113
+ }
114
+ .session-summary-btn:hover,
115
+ .session-settings-btn:hover {
116
+ transform: scale(1.1);
117
+ background: rgba(74, 144, 226, 0.25);
118
+ }
119
+ .nav-bar li button.remove-session {
120
+ background: transparent;
121
+ border: none;
122
+ color: inherit;
123
+ font-size: 1.2em;
124
+ cursor: pointer;
125
+ }
126
+ .new-session-btn {
127
+ padding: 2px 5px;
128
+ background: var(--primary-color);
129
+ color: #fff;
130
+ border: none;
131
+ border-radius: var(--border-radius);
132
+ cursor: pointer;
133
+ font-size: 1.1em;
134
+ transition: background 0.3s;
135
+ border-radius: 6px;
136
+ }
137
+ .new-session-btn:hover {
138
+ background: var(--primary-dark);
139
+ }
140
+ /* Toggle Layout Button at bottom of nav-bar */
141
+ #toggleLayoutBtn {
142
+ background: var(--primary-color);
143
+ color: #fff;
144
+ border: none;
145
+ padding: 2px 5px;
146
+ font-size: 1.1em;
147
+ border-radius: 6px;
148
+ cursor: pointer;
149
+ transition: background 0.3s;
150
+ margin-top: auto;
151
+ }
152
+ #toggleLayoutBtn:hover {
153
+ background: var(--primary-dark);
154
+ }
155
+ /* Chat Wrapper */
156
+ .chat-wrapper {
157
+ flex: 1;
158
+ background: #fff;
159
+ margin: 20px;
160
+ border-radius: var(--border-radius);
161
+ box-shadow: 0 12px 40px var(--light-shadow);
162
+ display: flex;
163
+ flex-direction: column;
164
+ overflow: hidden;
165
+ position: relative;
166
+ transition: all 0.3s ease;
167
+ }
168
+ /* Chat Header */
169
+ .chat-header {
170
+ display: flex;
171
+ align-items: center;
172
+ gap: 16px;
173
+ padding: 16px 40px 16px 20px;
174
+ border-bottom: 1px solid #eee;
175
+ background: #fff;
176
+ transition: margin-left 0.3s ease;
177
+ }
178
+ /* Left portion of the chat header (hamburger + new chat if collapsed) */
179
+ .header-left {
180
+ display: flex;
181
+ align-items: center;
182
+ gap: 10px;
183
+ }
184
+ .chat-title-controls {
185
+ display: flex;
186
+ align-items: center;
187
+ gap: 10px;
188
+ flex: 1;
189
+ justify-content: space-between;
190
+ }
191
+ .chat-header h2 {
192
+ font-size: 1.5em;
193
+ color: var(--primary-color);
194
+ margin: 0;
195
+ flex-grow: 1;
196
+ white-space: nowrap;
197
+ }
198
+ .chat-header button {
199
+ background: none;
200
+ border: none;
201
+ color: white;
202
+ background-color: var(--primary-color);
203
+ font-size: 1em;
204
+ cursor: pointer;
205
+ transition: color 0.3s;
206
+ }
207
+ .chat-header button:hover {
208
+ color: var(--primary-dark);
209
+ }
210
+ #editTitleBtn {
211
+ color: var(--primary-color);
212
+ background: none;
213
+ }
214
+ /* Turn Label (now part of the header's flex layout) */
215
+ #turnLabel {
216
+ background: var(--accent-color);
217
+ color: #fff;
218
+ padding: 6px 12px;
219
+ border-radius: 20px;
220
+ font-size: 1em;
221
+ white-space: nowrap;
222
+ }
223
+ /* Carousel Wrapper */
224
+ .carousel-wrapper {
225
+ position: relative;
226
+ flex: 1;
227
+ background: var(--background-color);
228
+ transition: all 0.3s ease;
229
+ overflow: hidden;
230
+ }
231
+ .carousel-wrapper.traditional-mode {
232
+ overflow-y: auto;
233
+ }
234
+ /* Carousel */
235
+ .carousel {
236
+ display: flex;
237
+ height: 100%;
238
+ transition: transform 0.5s ease;
239
+ }
240
+ .carousel.traditional {
241
+ flex-direction: column;
242
+ height: auto;
243
+ transform: none !important;
244
+ }
245
+ /* Card */
246
+ .card {
247
+ min-width: 100%;
248
+ height: 100%;
249
+ padding: 40px 70px 40px 70px;
250
+ display: flex;
251
+ flex-direction: column;
252
+ overflow-y: auto;
253
+ transition: all 0.3s ease;
254
+ }
255
+ .carousel.traditional .card {
256
+ margin-bottom: 20px;
257
+ padding: 20px 40px;
258
+ height: auto;
259
+ }
260
+ .conversation {
261
+ display: flex;
262
+ flex-direction: column;
263
+ gap: 20px;
264
+ }
265
+ /* Message Bubbles (Order: User then AI; rendered as Markdown) */
266
+ .message {
267
+ padding: 14px 20px;
268
+ border-radius: 24px;
269
+ font-size: 1em;
270
+ line-height: 1.6;
271
+ max-width: 90%;
272
+ position: relative;
273
+ box-shadow: 0 3px 10px var(--light-shadow);
274
+ transition: background 0.3s, transform 0.3s;
275
+ }
276
+ .user {
277
+ background: #e9f1ff;
278
+ border: 1px solid #cbdffb;
279
+ align-self: flex-end;
280
+ }
281
+ .ai {
282
+ background: #fff4e6;
283
+ border: 1px solid #ffe0b2;
284
+ align-self: flex-start;
285
+ }
286
+ .message-text {
287
+ display: block;
288
+ max-height: 80px;
289
+ overflow: hidden;
290
+ transition: max-height 0.3s ease;
291
+ }
292
+ .message.expanded .message-text {
293
+ max-height: none;
294
+ }
295
+ /* Ensure full AI message text is visible */
296
+ .ai .message-text {
297
+ max-height: none !important;
298
+ overflow: visible !important;
299
+ }
300
+ .ai .ai-status {
301
+ margin-top: 8px;
302
+ padding-top: 4px;
303
+ border-top: 1px solid rgba(0, 0, 0, 0.1);
304
+ font-size: 0.7em;
305
+ color: #666;
306
+ text-align: right;
307
+ }
308
+ .toggle-btn {
309
+ background: none;
310
+ border: none;
311
+ color: var(--primary-color);
312
+ cursor: pointer;
313
+ font-size: 0.85em;
314
+ margin-top: 6px;
315
+ padding: 0;
316
+ }
317
+ .vertical-file-list {
318
+ margin-bottom: 10px;
319
+ display: flex;
320
+ flex-direction: column;
321
+ gap: 5px;
322
+ }
323
+ .file-item-vertical {
324
+ background: #0500e624;
325
+ padding: 6px 10px;
326
+ border-radius: var(--border-radius);
327
+ font-size: 0.5em;
328
+ font-style: italic;
329
+ }
330
+ /* Navigation Buttons */
331
+ .nav {
332
+ position: absolute;
333
+ top: 50%;
334
+ transform: translateY(-50%);
335
+ width: 50px;
336
+ height: 50px;
337
+ background: rgba(255, 255, 255, 0);
338
+ /* backdrop-filter: blur(4px); */
339
+ border: none;
340
+ border-radius: 50%;
341
+ display: flex;
342
+ align-items: center;
343
+ justify-content: center;
344
+ cursor: pointer;
345
+ z-index: 10;
346
+ transition: transform 0.3s, background 0.3s;
347
+ }
348
+ .nav:hover {
349
+ transform: translateY(-50%) scale(1.1);
350
+ background: rgba(255, 255, 255, 0.8);
351
+ }
352
+ .nav:disabled {
353
+ opacity: 0.5;
354
+ cursor: default;
355
+ }
356
+ #prevBtn {
357
+ left: 1px;
358
+ }
359
+ #nextBtn {
360
+ right: 1px;
361
+ }
362
+ .carousel.traditional ~ .nav {
363
+ display: none;
364
+ }
365
+ /* Input Section */
366
+ .input-container {
367
+ padding: 20px;
368
+ background: #fafafa;
369
+ border-top: 1px solid #eee;
370
+ display: flex;
371
+ flex-direction: column;
372
+ }
373
+ .file-attachments {
374
+ display: flex;
375
+ flex-direction: row;
376
+ gap: 12px;
377
+ overflow-x: auto;
378
+ padding-bottom: 12px;
379
+ scrollbar-width: thin;
380
+ }
381
+ .file-attachments::-webkit-scrollbar {
382
+ height: 6px;
383
+ }
384
+ .file-attachments::-webkit-scrollbar-track {
385
+ background: #f1f1f1;
386
+ }
387
+ .file-attachments::-webkit-scrollbar-thumb {
388
+ background: #ccc;
389
+ border-radius: 3px;
390
+ }
391
+ .file-attachments::-webkit-scrollbar-thumb:hover {
392
+ background: #999;
393
+ }
394
+ .file-item {
395
+ background: #f0f0f0;
396
+ padding: 8px 12px;
397
+ border-radius: 6px;
398
+ display: flex;
399
+ align-items: center;
400
+ gap: 8px;
401
+ font-size: 0.9em;
402
+ }
403
+ .file-item button {
404
+ background: none;
405
+ border: none;
406
+ color: var(--primary-color);
407
+ cursor: pointer;
408
+ font-size: 1em;
409
+ }
410
+ .input-row {
411
+ display: flex;
412
+ align-items: center;
413
+ gap: 12px;
414
+ flex-wrap: wrap;
415
+ background: #fff;
416
+ border: 1px solid #ddd;
417
+ border-radius: 24px;
418
+ padding: 6px 12px;
419
+ }
420
+ .attach-button {
421
+ background: none;
422
+ border: none;
423
+ font-size: 1.4em;
424
+ cursor: pointer;
425
+ color: #999;
426
+ display: flex;
427
+ align-items: center;
428
+ justify-content: center;
429
+ width: 36px;
430
+ height: 36px;
431
+ border-radius: 50%;
432
+ transition: background 0.3s;
433
+ }
434
+ .attach-button:hover {
435
+ background: #f0f0f0;
436
+ }
437
+ #fileInput {
438
+ display: none;
439
+ }
440
+ #chatInput {
441
+ flex: 1;
442
+ border: none;
443
+ outline: none;
444
+ resize: none;
445
+ overflow: scroll;
446
+ font-size: 1em;
447
+ height: 36px;
448
+ /* line-height: 36px; */
449
+ margin: 0;
450
+ padding: 0 8px;
451
+ padding: 10px;
452
+ max-height: 100px;
453
+ }
454
+ #chatInput::placeholder {
455
+ color: #999;
456
+ }
457
+ #sendBtn svg {
458
+ width: 16px !important;
459
+ height: 16px !important;
460
+ }
461
+ #sendBtn {
462
+ background: none;
463
+ border: none;
464
+ cursor: pointer;
465
+ width: 36px;
466
+ height: 36px;
467
+ border-radius: 50%;
468
+ display: flex;
469
+ align-items: center;
470
+ justify-content: center;
471
+ color: #999;
472
+ transition: background 0.3s;
473
+ }
474
+ #sendBtn:hover {
475
+ background: #f0f0f0;
476
+ }
477
+
478
+ /* Redesigned Summary Overlay with Max Height */
479
+ #summaryOverlay {
480
+ position: fixed;
481
+ bottom: 0;
482
+ left: 50%;
483
+ transform: translateX(-50%) translateY(100%);
484
+ width: 80%;
485
+ /* max-width: 600px; Wider than settings panel */
486
+ min-height: 30vh; /* Limit to half the viewport height */
487
+ max-height: 60vh; /* Limit to half the viewport height */
488
+ background: linear-gradient(135deg, #ffffff, #f7f7f7);
489
+ border-top-left-radius: 20px;
490
+ border-top-right-radius: 20px;
491
+ box-shadow: 0 -4px 20px rgba(0, 0, 0, 0.1);
492
+ transition: transform 0.3s ease;
493
+ z-index: 20;
494
+ display: flex;
495
+ flex-direction: column;
496
+ overflow: hidden;
497
+ /* padding-bottom: 100px; */
498
+ }
499
+
500
+ #summaryOverlay.active {
501
+ transform: translateX(-50%) translateY(0);
502
+ }
503
+
504
+ .summary-header {
505
+ display: flex;
506
+ align-items: center;
507
+ justify-content: space-between;
508
+ padding: 16px 20px;
509
+ background: linear-gradient(135deg, #4a90e2, #4178c0);
510
+ color: #fff;
511
+ font-size: 1.5em;
512
+ border-top-left-radius: 20px;
513
+ border-top-right-radius: 20px;
514
+ box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
515
+ }
516
+
517
+ .summary-header span {
518
+ font-weight: 500;
519
+ }
520
+
521
+ .summary-header-buttons {
522
+ display: flex;
523
+ gap: 12px;
524
+ }
525
+
526
+ .download-summary {
527
+ background: #fff;
528
+ color: #4a90e2;
529
+ border: 1px solid #4a90e2;
530
+ border-radius: 8px;
531
+ padding: 6px 12px;
532
+ cursor: pointer;
533
+ transition: background 0.3s, color 0.3s;
534
+ }
535
+
536
+ .download-summary:hover {
537
+ background: #4a90e2;
538
+ color: #fff;
539
+ }
540
+
541
+ .close-summary {
542
+ background: none;
543
+ border: none;
544
+ color: #fff;
545
+ font-size: 1.8em;
546
+ cursor: pointer;
547
+ line-height: 1;
548
+ }
549
+
550
+ .summary-content {
551
+ padding-top: 20px;
552
+ padding-bottom: 20px;
553
+ padding-left: 50px;
554
+ padding-right: 50px;
555
+ overflow-y: auto; /* Scrollbar appears when content exceeds available height */
556
+ flex: 1;
557
+ font-size: 1em;
558
+ color: #333;
559
+ background: #fff;
560
+ }
561
+
562
+ /* Redesigned Settings Overlay */
563
+ #settingsOverlay {
564
+ position: fixed;
565
+ bottom: 0;
566
+ left: 50%;
567
+ transform: translateX(-50%) translateY(100%);
568
+ width: 90%;
569
+ max-width: 500px;
570
+ background: linear-gradient(135deg, #ffffff, #f7f7f7);
571
+ border-top-left-radius: 20px;
572
+ border-top-right-radius: 20px;
573
+ box-shadow: 0 -4px 20px rgba(0, 0, 0, 0.1);
574
+ transition: transform 0.3s ease;
575
+ z-index: 20;
576
+ display: flex;
577
+ flex-direction: column;
578
+ overflow: hidden;
579
+ padding-bottom: 50px;
580
+ }
581
+
582
+ #settingsOverlay.active {
583
+ transform: translateX(-50%) translateY(0);
584
+ }
585
+
586
+ .settings-header {
587
+ display: flex;
588
+ align-items: center;
589
+ justify-content: space-between;
590
+ padding: 16px 20px;
591
+ background: linear-gradient(135deg, #4a90e2, #4178c0);
592
+ color: #fff;
593
+ font-size: 1.5em;
594
+ border-top-left-radius: 20px;
595
+ border-top-right-radius: 20px;
596
+ box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
597
+ }
598
+
599
+ .settings-header span {
600
+ font-weight: 500;
601
+ }
602
+
603
+ .close-settings {
604
+ background: none;
605
+ border: none;
606
+ color: #fff;
607
+ font-size: 1.8em;
608
+ cursor: pointer;
609
+ line-height: 1;
610
+ }
611
+
612
+ .settings-content {
613
+ padding: 20px;
614
+ flex: 1;
615
+ overflow-y: auto;
616
+ font-size: 1em;
617
+ color: #333;
618
+ background: #fff;
619
+ }
620
+
621
+ .settings-group {
622
+ display: flex;
623
+ flex-direction: column;
624
+ margin-bottom: 20px;
625
+ }
626
+
627
+ .settings-group label {
628
+ margin-bottom: 8px;
629
+ font-weight: 500;
630
+ }
631
+
632
+ .settings-group input[type="range"],
633
+ .settings-group input[type="number"],
634
+ .settings-group select {
635
+ padding: 10px;
636
+ border: 1px solid #ccc;
637
+ border-radius: 8px;
638
+ outline: none;
639
+ width: 100%;
640
+ font-size: 1em;
641
+ background: #fefefe;
642
+ transition: border 0.3s;
643
+ }
644
+
645
+ .settings-group input[type="range"]:focus,
646
+ .settings-group input[type="number"]:focus,
647
+ .settings-group select:focus {
648
+ border-color: #4a90e2;
649
+ }
650
+
651
+ .save-settings {
652
+ background: linear-gradient(135deg, #4a90e2, #4178c0);
653
+ color: #fff;
654
+ border: none;
655
+ border-radius: 8px;
656
+ padding: 12px;
657
+ width: 100%;
658
+ font-size: 1em;
659
+ cursor: pointer;
660
+ transition: background 0.3s;
661
+ }
662
+
663
+ .save-settings:hover {
664
+ background: linear-gradient(135deg, #4178c0, #4a90e2);
665
+ }
666
+
667
+ @media (max-width: 600px) {
668
+ .nav-bar {
669
+ display: none;
670
+ }
671
+ .chat-wrapper {
672
+ margin: 0;
673
+ }
674
+ .message {
675
+ font-size: 0.95em;
676
+ max-width: 80%;
677
+ }
678
+ .nav {
679
+ width: 40px;
680
+ height: 40px;
681
+ font-size: 1.5em;
682
+ }
683
+ .card {
684
+ padding: 80px 20px 20px 20px;
685
+ }
686
+ .input-container {
687
+ padding: 15px;
688
+ }
689
+ .input-row {
690
+ flex-direction: row;
691
+ }
692
+ .input-container button {
693
+ width: auto;
694
+ }
695
+ #summaryOverlay,
696
+ #settingsOverlay {
697
+ width: 80%;
698
+ }
699
+ }
700
+
701
+ .svg-icon {
702
+ width: 32px; /* Adjust as needed */
703
+ height: 32px; /* Adjust as needed */
704
+ filter: invert(100%) brightness(300%);
705
+ }
706
+
707
+ .svg-icon-non-white {
708
+ width: 24px; /* Adjust as needed */
709
+ height: 24px; /* Adjust as needed */
710
+ /* filter: invert(100%) brightness(300%); */
711
+ }
712
+
713
+ .button-row {
714
+ display: flex;
715
+ justify-content: center;
716
+ gap: 10px; /* Optional: adjust spacing between buttons */
717
+ margin-bottom: 10px; /* Space between buttons and the rest of the input area */
718
+ }
719
+
720
+ .button-row button {
721
+ padding: 8px 16px;
722
+ background: var(--primary-color);
723
+ color: #fff;
724
+ border: none;
725
+ border-radius: var(--border-radius);
726
+ cursor: pointer;
727
+ transition: background 0.3s;
728
+ }
729
+
730
+ .button-row button:hover {
731
+ background: var(--primary-dark);
732
+ }
733
+
734
+ #customBtn2 {
735
+ padding-top: 4px;
736
+ padding-bottom: 4px;
737
+ padding-left: 12px;
738
+ padding-right: 12px;
739
+ font-size: 0.98em;
740
+ }
741
+
742
+ #customBtn2 img {
743
+ width: 24px;
744
+ height: 24px;
745
+ }
746
+
747
+ #customBtn3 {
748
+ padding-top: 4px;
749
+ padding-bottom: 4px;
750
+ padding-left: 12px;
751
+ padding-right: 12px;
752
+ font-size: 0.98em;
753
+ }
754
+
755
+ #customBtn3 img {
756
+ width: 24px;
757
+ height: 24px;
758
+ }
759
+
760
+ /* Markdown */
761
+ .markdown-body {
762
+ font-size: 1em;
763
+ line-height: 1.5;
764
+ white-space: normal; /* Ensure paragraphs and lists break onto new lines */
765
+ }
766
+
767
+ .markdown-body p {
768
+ margin: 0.75em 0; /* Add vertical space between paragraphs */
769
+ }
770
+
771
+ .markdown-body ul,
772
+ .markdown-body ol {
773
+ margin: 0.75em 0;
774
+ padding-left: 1.5em; /* Indent bullets/numbers */
775
+ }
776
+
777
+ .markdown-body li {
778
+ margin: 0.3em 0;
779
+ }
780
+
781
+ .markdown-body h1,
782
+ .markdown-body h2,
783
+ .markdown-body h3,
784
+ .markdown-body h4,
785
+ .markdown-body h5,
786
+ .markdown-body h6 {
787
+ margin-top: 1em;
788
+ margin-bottom: 0.5em;
789
+ font-weight: bold;
790
+ }
791
+
792
+ /* Keep custom layout/styling but let hljs control colors */
793
+ .markdown-body pre {
794
+ overflow-x: auto;
795
+ max-width: 100%;
796
+ white-space: pre;
797
+ -webkit-overflow-scrolling: touch;
798
+ padding: 10px;
799
+ border-radius: 6px;
800
+ /* Remove background-color and color so hljs can apply its own */
801
+ }
802
+
803
+ .markdown-body pre code.hljs {
804
+ /* Customize only specific properties */
805
+ padding: 10px;
806
+ border-radius: 6px;
807
+ /* Let the theme dictate the background and text colors */
808
+ }
809
+
810
+ .blinking-cursor {
811
+ display: inline-block;
812
+ width: 10px;
813
+ height: 1em;
814
+ background-color: currentColor;
815
+ margin-left: 2px;
816
+ animation: blink 1s steps(2, start) infinite;
817
+ }
818
+
819
+ @keyframes blink {
820
+ 50% { opacity: 0; }
821
+ 100% { opacity: 1; }
822
+ }
823
+
824
+ .markdown-body pre {
825
+ overflow-x: auto;
826
+ max-width: 100%;
827
+ white-space: pre;
828
+ -webkit-overflow-scrolling: touch;
829
+ padding: 10px;
830
+ border-radius: 6px;
831
+ /* Let the highlight.js theme control the background and colors */
832
+ position: relative; /* Ensure the copy button is positioned correctly */
833
+ }
834
+
835
+ .copy-button {
836
+ position: absolute;
837
+ top: 10px;
838
+ right: 10px;
839
+ background: rgba(0, 0, 0, 0.4);
840
+ color: #fff;
841
+ border: none;
842
+ padding: 4px 8px;
843
+ font-size: 0.75rem;
844
+ border-radius: 4px;
845
+ cursor: pointer;
846
+ opacity: 0.7;
847
+ transition: opacity 0.3s;
848
+ }
849
+
850
+ .copy-button:hover {
851
+ opacity: 1;
852
+ }
853
+
854
+ .loading-overlay {
855
+ position: fixed;
856
+ top: 0;
857
+ left: 0;
858
+ width: 100%;
859
+ height: 100%;
860
+ background: rgba(255, 255, 255, 0.8);
861
+ display: flex;
862
+ flex-direction: column;
863
+ justify-content: center;
864
+ align-items: center;
865
+ opacity: 0;
866
+ visibility: hidden;
867
+ pointer-events: none;
868
+ transition: opacity 0.5s ease, visibility 0.5s ease;
869
+ z-index: 1000;
870
+ }
871
+
872
+ .loading-overlay.active {
873
+ opacity: 1;
874
+ visibility: visible;
875
+ pointer-events: auto;
876
+ }
877
+
878
+ .loading-animation {
879
+ position: relative;
880
+ width: 80px;
881
+ height: 80px;
882
+ margin-bottom: 20px;
883
+ }
884
+
885
+ .loading-animation .ripple {
886
+ position: absolute;
887
+ border: 4px solid #4a90e2; /* Adjust the color to match your theme */
888
+ opacity: 1;
889
+ border-radius: 50%;
890
+ animation: ripple 1.5s cubic-bezier(0.66, 0, 0, 1) infinite;
891
+ }
892
+
893
+ .loading-animation .ripple:nth-child(2) {
894
+ animation-delay: -0.75s;
895
+ }
896
+
897
+ @keyframes ripple {
898
+ 0% {
899
+ top: 36px;
900
+ left: 36px;
901
+ width: 0;
902
+ height: 0;
903
+ opacity: 1;
904
+ }
905
+ 100% {
906
+ top: 0;
907
+ left: 0;
908
+ width: 72px;
909
+ height: 72px;
910
+ opacity: 0;
911
+ }
912
+ }
913
+
914
+ /* Container for the toggle switch */
915
+ .toggle-switch {
916
+ position: relative;
917
+ display: inline-block;
918
+ width: 50px;
919
+ height: 34px;
920
+ margin-left: 10px;
921
+ }
922
+
923
+ /* Hide the default checkbox */
924
+ .toggle-switch input {
925
+ opacity: 0;
926
+ width: 0;
927
+ height: 0;
928
+ }
929
+
930
+ /* The slider (background) */
931
+ .toggle-slider {
932
+ position: absolute;
933
+ cursor: pointer;
934
+ top: 0;
935
+ left: 0;
936
+ right: 0;
937
+ bottom: 0;
938
+ background-color: #ccc;
939
+ transition: background-color 0.4s, box-shadow 0.4s;
940
+ border-radius: 34px;
941
+ box-shadow: inset 0 0 5px rgba(0,0,0,0.3);
942
+ }
943
+
944
+ /* The knob */
945
+ .toggle-slider:before {
946
+ position: absolute;
947
+ content: "";
948
+ height: 26px;
949
+ width: 26px;
950
+ background-color: #fff;
951
+ transition: transform 0.4s, box-shadow 0.4s;
952
+ border-radius: 50%;
953
+ box-shadow: 0 2px 4px rgba(0,0,0,0.2);
954
+ }
955
+
956
+ /* When the checkbox is checked */
957
+ .toggle-switch input:checked + .toggle-slider {
958
+ background-color: #4a90e2;
959
+ box-shadow: inset 0 0 5px rgba(0,0,0,0.2);
960
+ }
961
+
962
+ .toggle-switch input:checked + .toggle-slider:before {
963
+ transform: translateX(26px);
964
+ box-shadow: 0 2px 6px rgba(0,0,0,0.3);
965
+ }
966
+
967
+ /* Optional: styling for the label text */
968
+ .toggle-label {
969
+ font-size: 1em;
970
+ vertical-align: middle;
971
+ color: #333;
972
+ }
973
+
974
+ .ai-meta {
975
+ display: flex;
976
+ justify-content: space-between;
977
+ margin-top: 8px;
978
+ font-size: 0.8em;
979
+ color: #666;
980
+ }
981
+
982
+ .toggle-btn-summarize {
983
+ padding: 8px 16px;
984
+ background-color: #ccc !important; /* Dim color when off */
985
+ color: #fff;
986
+ border: none;
987
+ border-radius: 20px;
988
+ font-size: 0.9em;
989
+ cursor: pointer;
990
+ transition: background-color 0.3s ease;
991
+ }
992
+
993
+ .toggle-btn-summarize.active {
994
+ background-color: #4a90e2 !important; /* Vivid color when active */
995
+ }
standalone/server/main.py CHANGED
@@ -3,14 +3,31 @@ import httpx
3
  from fastapi import FastAPI, Request, HTTPException
4
  from fastapi.responses import StreamingResponse
5
  from fastapi.middleware.cors import CORSMiddleware
6
- from stream import openai, anthropic, google, huggingface
7
 
8
  app = FastAPI()
9
- app.include_router(openai.router)
10
- app.include_router(anthropic.router)
11
- app.include_router(google.router)
12
- app.include_router(huggingface.router)
13
-
14
  # Allow all origins for testing (adjust for production)
15
  app.add_middleware(
16
  CORSMiddleware,
@@ -20,176 +37,8 @@ app.add_middleware(
20
  allow_headers=["*"],
21
  )
22
 
23
- # Replace these with secure methods in production
24
- import os
25
- from collections import defaultdict
26
-
27
- @app.post("/summarize_openai")
28
- async def summarize_openai(request: Request):
29
- try:
30
- body = await request.json()
31
- except Exception as e:
32
- raise HTTPException(status_code=400, detail="Invalid JSON payload") from e
33
-
34
- previous_summary = body.get("previous_summary", "")
35
- latest_conversation = body.get("latest_conversation", "")
36
- persona = body.get("persona", "helpful assistant")
37
- temperature = body.get("temperature", 0.7)
38
- max_tokens = body.get("max_tokens", 1024)
39
- model = body.get("model", MODEL_NAME)
40
-
41
- # Load the prompt from prompts.toml
42
- import tomli
43
- with open("../../configs/prompts.toml", "rb") as f:
44
- prompts_config = tomli.load(f)
45
-
46
- # Get the prompt and system prompt
47
- prompt_template = prompts_config["summarization"]["prompt"]
48
- system_prompt = prompts_config["summarization"]["system_prompt"]
49
-
50
- # Replace variables in the prompt
51
- prompt = prompt_template.replace("$previous_summary", previous_summary).replace("$latest_conversation", latest_conversation)
52
- system_prompt = system_prompt.replace("$persona", persona)
53
-
54
- # Using OpenAI's SDK
55
- from openai import AsyncOpenAI
56
-
57
- # Initialize the client with the API key
58
- client = AsyncOpenAI(api_key=OPENAI_API_KEY)
59
-
60
- try:
61
- print(f"Starting OpenAI summarization for model: {model}")
62
-
63
- # Use the SDK to create a completion
64
- response = await client.chat.completions.create(
65
- model=model,
66
- messages=[
67
- {"role": "system", "content": system_prompt},
68
- {"role": "user", "content": prompt}
69
- ],
70
- temperature=temperature,
71
- max_tokens=max_tokens
72
- )
73
-
74
-         summary = response.choices[0].message.content
-         print("OpenAI summarization completed successfully")
- 
-         return {"summary": summary}
- 
-     except Exception as e:
-         print(f"Error during OpenAI summarization: {str(e)}")
-         raise HTTPException(status_code=500, detail=f"Error during summarization: {str(e)}")
- 
- @app.post("/summarize_anthropic")
- async def summarize_anthropic(request: Request):
-     try:
-         body = await request.json()
-     except Exception as e:
-         raise HTTPException(status_code=400, detail="Invalid JSON payload") from e
- 
-     previous_summary = body.get("previous_summary", "")
-     latest_conversation = body.get("latest_conversation", "")
-     persona = body.get("persona", "helpful assistant")
-     temperature = body.get("temperature", 0.7)
-     max_tokens = body.get("max_tokens", 1024)
-     model = body.get("model", "claude-3-opus-20240229")
- 
-     # Load the prompt from prompts.toml
-     import tomli
-     with open("../../configs/prompts.toml", "rb") as f:
-         prompts_config = tomli.load(f)
- 
-     # Get the prompt and system prompt
-     prompt_template = prompts_config["summarization"]["prompt"]
-     system_prompt = prompts_config["summarization"]["system_prompt"]
- 
-     # Replace variables in the prompt
-     prompt = prompt_template.replace("$previous_summary", previous_summary).replace("$latest_conversation", latest_conversation)
-     system_prompt = system_prompt.replace("$persona", persona)
- 
-     try:
-         import anthropic
- 
-         # Initialize Anthropic client
-         client = anthropic.Anthropic(api_key=ANTHROPIC_API_KEY)
- 
-         print(f"Starting Anthropic summarization for model: {model}")
- 
-         # Create the response
-         response = client.messages.create(
-             model=model,
-             messages=[
-                 {"role": "user", "content": prompt}
-             ],
-             system=system_prompt,
-             max_tokens=max_tokens,
-             temperature=temperature
-         )
- 
-         summary = response.content[0].text
-         print("Anthropic summarization completed successfully")
- 
-         return {"summary": summary}
- 
-     except Exception as e:
-         print(f"Error during Anthropic summarization: {str(e)}")
-         raise HTTPException(status_code=500, detail=f"Error during summarization: {str(e)}")
- 
- @app.post("/summarize_google")
- async def summarize_google(request: Request):
-     try:
-         body = await request.json()
-     except Exception as e:
-         raise HTTPException(status_code=400, detail="Invalid JSON payload") from e
- 
-     previous_summary = body.get("previous_summary", "")
-     latest_conversation = body.get("latest_conversation", "")
-     persona = body.get("persona", "helpful assistant")
-     temperature = body.get("temperature", 0.7)
-     max_tokens = body.get("max_tokens", 1024)
-     model = body.get("model", "gemini-1.5-pro")
- 
-     # Load the prompt from prompts.toml
-     import tomli
-     with open("../../configs/prompts.toml", "rb") as f:
-         prompts_config = tomli.load(f)
- 
-     # Get the prompt and system prompt
-     prompt_template = prompts_config["summarization"]["prompt"]
-     system_prompt = prompts_config["summarization"]["system_prompt"]
- 
-     # Replace variables in the prompt
-     prompt = prompt_template.replace("$previous_summary", previous_summary).replace("$latest_conversation", latest_conversation)
-     system_prompt = system_prompt.replace("$persona", persona)
- 
-     try:
-         import google.generativeai as genai
- 
-         # Configure the Google API
-         genai.configure(api_key=GOOGLE_API_KEY)
- 
-         # Initialize the model
-         model_obj = genai.GenerativeModel(model_name=model)
- 
-         print(f"Starting Google summarization for model: {model}")
- 
-         # Combine system prompt and user prompt for Google's API
-         combined_prompt = f"{system_prompt}\n\n{prompt}"
- 
-         # Generate the response
-         response = model_obj.generate_content(
-             contents=combined_prompt,
-             generation_config=genai.types.GenerationConfig(
-                 temperature=temperature,
-                 max_output_tokens=max_tokens
-             )
-         )
- 
-         summary = response.text
-         print("Google summarization completed successfully")
- 
-         return {"summary": summary}
- 
-     except Exception as e:
-         print(f"Error during Google summarization: {str(e)}")
-         raise HTTPException(status_code=500, detail=f"Error during summarization: {str(e)}")
  from fastapi import FastAPI, Request, HTTPException
  from fastapi.responses import StreamingResponse
  from fastapi.middleware.cors import CORSMiddleware
+ from fastapi.staticfiles import StaticFiles
+ 
+ from stream import (
+     openai as openai_stream,
+     anthropic as anthropic_stream,
+     google as google_stream,
+     huggingface as huggingface_stream,
+     mistral as mistral_stream
+ )
+ from summary import (
+     openai as openai_summary,
+     google as google_summary,
+     # anthropic as anthropic_summary, google as google_summary,
+     # huggingface as huggingface_summary, mistral as mistral_summary
+ )
 
  app = FastAPI()
+ app.include_router(openai_stream.router)
+ app.include_router(anthropic_stream.router)
+ app.include_router(google_stream.router)
+ app.include_router(huggingface_stream.router)
+ app.include_router(mistral_stream.router)
+ 
+ app.include_router(openai_summary.router)
+ app.include_router(google_summary.router)
  # Allow all origins for testing (adjust for production)
  app.add_middleware(
      CORSMiddleware,
 
      allow_headers=["*"],
  )
 
+ app.mount("/", StaticFiles(directory="../front", html=True), name="static")
 
+ if __name__ == "__main__":
+     import uvicorn
+     uvicorn.run("main:app", host="127.0.0.1", port=8000, reload=True)
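With the provider routers and the static front-end mounted on a single FastAPI app, the service can be sanity-checked from Python once it is running. The snippet below is a hypothetical smoke test, not part of the commit: it assumes the API keys the routers read at import time are exported and that the `../front` directory exists.

```python
# Hypothetical smoke test for the assembled app (illustration only).
from fastapi.testclient import TestClient
from main import app

client = TestClient(app)
print([getattr(r, "path", None) for r in app.routes])  # e.g. /mistral_stream, /openai_summary, the "/" mount
assert client.get("/").status_code == 200              # index.html served by StaticFiles(html=True)
```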
standalone/server/stream/mistral.py ADDED
@@ -0,0 +1,104 @@
+ import os
+ import json
+ from fastapi import FastAPI, Request, HTTPException
+ from fastapi.responses import StreamingResponse
+ from fastapi import APIRouter
+ 
+ from mistralai import Mistral
+ 
+ from .utils import handle_attachments, extract_text_from_pdf
+ 
+ router = APIRouter()
+ 
+ MISTRAL_API_KEY = os.environ.get("MISTRAL_API_KEY")
+ client = Mistral(api_key=MISTRAL_API_KEY)
+ 
+ attachments_in_mistral = {}
+ 
+ @router.post("/mistral_stream")
+ async def mistral_stream(request: Request):
+     try:
+         body = await request.json()
+     except Exception as e:
+         raise HTTPException(status_code=400, detail="Invalid JSON payload") from e
+ 
+     conversation = body.get("conversation")
+     if not conversation:
+         raise HTTPException(status_code=400, detail="Missing 'conversation' in payload")
+ 
+     print("--------------------------------")
+     print(body)
+     print()
+     temperature = body.get("temperature", 0.7)
+     max_tokens = body.get("max_tokens", 256)
+     model = body.get("model", "mistral-small-latest")
+     if "codestral" in model: model = model.replace("mistral-", "")
+     if "ministral" in model: model = model.replace("mistral-", "")
+ 
+     # Get session ID from the request
+     session_id = request.headers.get("X-Session-ID")
+     if session_id not in attachments_in_mistral: attachments_in_mistral[session_id] = {}
+     if not session_id:
+         raise HTTPException(status_code=400, detail="Missing 'session_id' in payload")
+ 
+     # Handle file attachments if present
+     conversation = await handle_attachments(session_id, conversation)
+     mistral_messages = []
+     for msg in conversation:
+         role = "user" if msg["role"] == "user" else "assistant"
+ 
+         pdf_texts = []
+         if "attachments" in msg:
+             for attachment in msg["attachments"]:
+                 if attachment["file_path"].endswith(".pdf"):
+                     if attachment["file_path"] not in attachments_in_mistral[session_id]:
+                         pdf_text = await extract_text_from_pdf(attachment["file_path"])
+                         pdf_texts.append([attachment["name"], pdf_text])
+                         attachments_in_mistral[session_id][attachment["name"]] = pdf_text
+                     else:
+                         pdf_texts.append([attachment["name"], attachments_in_mistral[session_id][attachment["name"]]])
+ 
+         mistral_messages.append({"role": role, "content": msg["content"]})
+         for pdf_text in pdf_texts:
+             mistral_messages.append({"role": "user", "content": f"{pdf_text[0]}\n\n{pdf_text[1]}"})
+ 
+     async def event_generator():
+         try:
+             print(f"Starting stream for model: {model}, temperature: {temperature}, max_tokens: {max_tokens}")
+             line_count = 0
+ 
+             # Use the SDK to create a streaming completion
+             stream = await client.chat.stream_async(
+                 model=model,
+                 messages=mistral_messages,
+                 temperature=temperature,
+                 max_tokens=max_tokens,
+             )
+ 
+             async for chunk in stream:
+                 print(chunk.__dict__)
+                 if chunk.data.choices and chunk.data.choices[0].delta.content is not None:
+                     content = chunk.data.choices[0].delta.content
+                     line_count += 1
+                     if line_count % 10 == 0:
+                         print(f"Processed {line_count} stream chunks")
+ 
+                     # Format the response in the same way as OpenAI
+                     response_json = json.dumps({
+                         "choices": [{"delta": {"content": content}}]
+                     })
+                     yield f"data: {response_json}\n\n"
+ 
+             # Send the [DONE] marker
+             print("Stream completed successfully")
+             yield "data: [DONE]\n\n"
+ 
+         except Exception as e:
+             print(f"Error during streaming: {str(e)}")
+             yield f"data: {{\"error\": \"{str(e)}\"}}\n\n"
+         finally:
+             print(f"Stream ended after processing {line_count if 'line_count' in locals() else 0} chunks")
+ 
+     print("Returning StreamingResponse to client")
+     return StreamingResponse(event_generator(), media_type="text/event-stream")
+ 
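The new Mistral route emits the same Server-Sent Events wire format as the OpenAI route (`data: {json}` chunks terminated by `data: [DONE]`), so one client parser covers every provider. A minimal consumption sketch, assuming the server runs locally on port 8000 and using a made-up session id:

```python
# Illustrative SSE consumer for /mistral_stream (host, port, and session id are assumptions).
import json
import requests

with requests.post(
    "http://127.0.0.1:8000/mistral_stream",
    headers={"X-Session-ID": "demo-session"},
    json={"conversation": [{"role": "user", "content": "Hello!"}]},
    stream=True,
) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream marker sent by the server
        chunk = json.loads(payload)
        print(chunk["choices"][0]["delta"]["content"], end="", flush=True)
```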
standalone/server/stream/openai.py CHANGED
@@ -1,5 +1,6 @@
  import os
  import json
+ import asyncio
  from fastapi import FastAPI, Request, HTTPException
  from fastapi.responses import StreamingResponse
  from fastapi import APIRouter
@@ -33,9 +34,10 @@ async def openai_stream(request: Request):
      max_tokens = body.get("max_tokens", 256)
      model = body.get("model", "gpt-4o-mini")
 
-     # Get session ID from the request
+     # Get session ID from the request headers
      session_id = request.headers.get("X-Session-ID")
-     if session_id not in attachments_in_openai: attachments_in_openai[session_id] = {}
+     if session_id not in attachments_in_openai:
+         attachments_in_openai[session_id] = {}
      if not session_id:
          raise HTTPException(status_code=400, detail="Missing 'session_id' in payload")
 
@@ -49,7 +51,7 @@ async def openai_stream(request: Request):
          if "attachments" in msg:
              for attachment in msg["attachments"]:
                  if attachment["file_path"].endswith(".pdf"):
-                     if attachment["file_path"] not in attachments_in_openai[session_id]:
+                     if attachment["file_path"] not in attachments_in_openai[session_id]:
                          pdf_text = await extract_text_from_pdf(attachment["file_path"])
                          pdf_texts.append([attachment["name"], pdf_text])
                          attachments_in_openai[session_id][attachment["name"]] = pdf_text
@@ -61,9 +63,9 @@ async def openai_stream(request: Request):
          gpt_messages.append({"role": "user", "content": f"{pdf_text[0]}\n\n{pdf_text[1]}"})
 
      async def event_generator():
+         line_count = 0
          try:
              print(f"Starting stream for model: {model}, temperature: {temperature}, max_tokens: {max_tokens}")
-             line_count = 0
 
              # Use the SDK to create a streaming completion
              stream = await client.chat.completions.create(
@@ -81,21 +83,23 @@ async def openai_stream(request: Request):
                      if line_count % 10 == 0:
                          print(f"Processed {line_count} stream chunks")
 
-                     # Format the response in the same way as before
                      response_json = json.dumps({
                          "choices": [{"delta": {"content": content}}]
                      })
                      yield f"data: {response_json}\n\n"
 
-             # Send the [DONE] marker
              print("Stream completed successfully")
              yield "data: [DONE]\n\n"
 
+         except asyncio.CancelledError:
+             print("Streaming aborted by client")
+             # You can choose to do cleanup here if needed.
+             raise  # Re-raise to allow FastAPI to handle the cancellation properly.
          except Exception as e:
              print(f"Error during streaming: {str(e)}")
              yield f"data: {{\"error\": \"{str(e)}\"}}\n\n"
          finally:
-             print(f"Stream ended after processing {line_count if 'line_count' in locals() else 0} chunks")
+             print(f"Stream ended after processing {line_count} chunks")
 
      print("Returning StreamingResponse to client")
-     return StreamingResponse(event_generator(), media_type="text/event-stream")
+     return StreamingResponse(event_generator(), media_type="text/event-stream")
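The new `asyncio.CancelledError` branch is what makes the stop button behave: when the browser drops the SSE connection, Starlette cancels the generator task, the branch logs the abort, and re-raising lets FastAPI finish shutting the response down. A rough client-side illustration of such an abort follows; the route path is assumed from the handler name, and the host and session id are placeholders.

```python
# Hypothetical early-abort client: closing the response mid-stream should
# surface as asyncio.CancelledError inside event_generator() on the server.
import requests

resp = requests.post(
    "http://127.0.0.1:8000/openai_stream",          # path assumed from the handler name
    headers={"X-Session-ID": "demo-session"},
    json={"conversation": [{"role": "user", "content": "Write a long story."}]},
    stream=True,
)
for i, _line in enumerate(resp.iter_lines()):
    if i >= 20:        # read a few chunks, then abandon the stream
        resp.close()
        break
```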
standalone/server/summary/__init__.py ADDED
@@ -0,0 +1 @@
+ prev_summaries = {}
standalone/server/summary/google.py ADDED
@@ -0,0 +1,82 @@
+ import os
+ import json
+ import tomli
+ from string import Template
+ 
+ from fastapi import FastAPI, Request, HTTPException
+ from fastapi import APIRouter
+ 
+ from google.genai import types
+ from google import genai
+ 
+ from . import prev_summaries
+ 
+ router = APIRouter()
+ 
+ GOOGLE_API_KEY = os.environ.get("GOOGLE_API_KEY")
+ client = genai.client.AsyncClient(genai.client.ApiClient(api_key=GOOGLE_API_KEY))
+ 
+ @router.post("/gemini_summary")
+ async def gemini_summary(request: Request):
+     try:
+         body = await request.json()
+     except Exception as e:
+         raise HTTPException(status_code=400, detail="Invalid JSON payload") from e
+ 
+     conversation = body.get("conversation")
+     if not conversation:
+         raise HTTPException(status_code=400, detail="Missing 'conversation' in payload")
+ 
+     print("--------------------------------")
+     print(body)
+     print()
+     temperature = body.get("temperature", 0.7)
+     max_tokens = body.get("max_tokens", 256)
+     model = body.get("model", "gemini-1.5-flash")
+ 
+     # Get session ID from the request
+     session_id = request.headers.get("X-Session-ID")
+     if not session_id:
+         raise HTTPException(status_code=400, detail="Missing 'session_id' in payload")
+ 
+     if session_id not in prev_summaries:
+         prev_summaries[session_id] = ""
+ 
+     prev_summary = prev_summaries[session_id]
+ 
+     with open("../../configs/prompts.toml", "rb") as f:
+         prompts = tomli.load(f)
+ 
+     prompt = Template(prompts["summarization"]["prompt"])
+     system_prompt = Template(prompts["summarization"]["system_prompt"])
+ 
+     latest_conversations = conversation[-2:]
+ 
+     for i, latest_conversation in enumerate(latest_conversations):
+         # if "attachments" in latest_conversation:
+         if "attachments" in latest_conversation:
+             del latest_conversation["attachments"]
+         if "sessionId" in latest_conversation:
+             del latest_conversation["sessionId"]
+         latest_conversations[i] = latest_conversation
+ 
+     summary = await client.models.generate_content(
+         model=model,
+         contents=[
+             prompt.safe_substitute(
+                 previous_summary=prev_summary,
+                 latest_conversation=str(latest_conversations)
+             )
+         ],
+         config=types.GenerateContentConfig(
+             system_instruction=system_prompt.substitute(persona="professional"),
+             temperature=temperature,
+             max_output_tokens=max_tokens,
+             top_p=0.95,
+         )
+     )
+ 
+     print(summary)
+     summary_text = summary.text
+     prev_summaries[session_id] = summary_text
+     return {"summary": summary_text}
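Both summary routers fill the prompt loaded from `configs/prompts.toml` with `string.Template`. The Gemini route uses `safe_substitute`, which leaves any placeholder it is not given in place instead of raising `KeyError`. A tiny illustration, where the template text is a stand-in rather than the real prompt:

```python
# Stand-in template showing substitute vs. safe_substitute behaviour.
from string import Template

t = Template("Previous summary:\n$previous_summary\n\nLatest turns:\n$latest_conversation\n\nBudget: $cost")
print(t.safe_substitute(previous_summary="", latest_conversation="[...]"))  # "$cost" is left untouched
# t.substitute(previous_summary="", latest_conversation="[...]")            # would raise KeyError for $cost
```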
standalone/server/summary/openai.py ADDED
@@ -0,0 +1,73 @@
+ import os
+ import json
+ import tomli
+ from string import Template
+ 
+ from fastapi import FastAPI, Request, HTTPException
+ from fastapi import APIRouter
+ 
+ from openai import AsyncOpenAI
+ 
+ from . import prev_summaries
+ router = APIRouter()
+ 
+ OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")
+ client = AsyncOpenAI(api_key=OPENAI_API_KEY)
+ 
+ @router.post("/openai_summary")
+ async def openai_summary(request: Request):
+     try:
+         body = await request.json()
+     except Exception as e:
+         raise HTTPException(status_code=400, detail="Invalid JSON payload") from e
+ 
+     conversation = body.get("conversation")
+     if not conversation:
+         raise HTTPException(status_code=400, detail="Missing 'conversation' in payload")
+ 
+     print("--------------------------------")
+     print(body)
+     print()
+     temperature = body.get("temperature", 0.7)
+     max_tokens = body.get("max_tokens", 256)
+     model = body.get("model", "gpt-4o-mini")
+ 
+     # Get session ID from the request
+     session_id = request.headers.get("X-Session-ID")
+     if not session_id:
+         raise HTTPException(status_code=400, detail="Missing 'session_id' in payload")
+ 
+     if session_id not in prev_summaries:
+         prev_summaries[session_id] = ""
+ 
+     prev_summary = prev_summaries[session_id]
+ 
+     with open("../../configs/prompts.toml", "rb") as f:
+         prompts = tomli.load(f)
+ 
+     prompt = Template(prompts["summarization"]["prompt"])
+     system_prompt = Template(prompts["summarization"]["system_prompt"])
+ 
+     latest_conversations = conversation[-2:]
+ 
+     for i, latest_conversation in enumerate(latest_conversations):
+         # if "attachments" in latest_conversation:
+         if "attachments" in latest_conversation:
+             del latest_conversation["attachments"]
+         if "sessionId" in latest_conversation:
+             del latest_conversation["sessionId"]
+         latest_conversations[i] = latest_conversation
+ 
+     summary = await client.chat.completions.create(
+         model=model,
+         messages=[
+             {"role": "system", "content": system_prompt.substitute(persona="professional")},
+             {"role": "user", "content": prompt.substitute(previous_summary=prev_summary, latest_conversation=str(latest_conversations))}
+         ],
+         max_tokens=max_tokens,
+         temperature=temperature
+     )
+ 
+     summary_text = summary.choices[0].message.content
+     prev_summaries[session_id] = summary_text
+     return {"summary": summary_text}
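Together with the module-level `prev_summaries` dict, these routers implement the adaptive part of AdaptSum: each call receives only the last two turns, merges them into the summary cached for the request's `X-Session-ID`, and stores the result for the next call. A sketch of two consecutive calls; host, session id, and payloads are made up, and `OPENAI_API_KEY` must be set on the server.

```python
# Illustrative rolling-summary flow against /openai_summary (assumed host and session id).
import requests

BASE = "http://127.0.0.1:8000"
headers = {"X-Session-ID": "demo-session"}

turns = [
    {"role": "user", "content": "What is AdaptSum?"},
    {"role": "assistant", "content": "A system that updates summaries incrementally."},
]
# First call: the cached summary for this session is still empty.
first = requests.post(f"{BASE}/openai_summary", headers=headers,
                      json={"conversation": turns}).json()

turns += [
    {"role": "user", "content": "How does it stay cheap?"},
    {"role": "assistant", "content": "Only the newest two turns are folded into the cached summary."},
]
# Second call with the same session id: the server merges the new turns into
# the summary it stored in prev_summaries after the first call.
second = requests.post(f"{BASE}/openai_summary", headers=headers,
                       json={"conversation": turns}).json()
print(second["summary"])
```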