Update README.md

README.md
---
title: Hugging Face Information Server
emoji: 📚
The server's public endpoint is:
`https://agents-mcp-hackathon-huggingfacedoc.hf.space/gradio_api/mcp/sse`
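Gradio-hosted MCP servers expose their SSE endpoint under a fixed path suffix, so the endpoint can be derived mechanically from a Space's base URL. A stdlib-only sketch (`mcp_endpoint` is a hypothetical helper, not part of any library):

```python
from urllib.parse import urljoin

# Path suffix Gradio uses for its MCP SSE endpoint.
MCP_SSE_PATH = "gradio_api/mcp/sse"

def mcp_endpoint(space_base_url: str) -> str:
    # Append the Gradio MCP SSE path to the Space's base URL,
    # tolerating a present or missing trailing slash.
    return urljoin(space_base_url.rstrip("/") + "/", MCP_SSE_PATH)

print(mcp_endpoint("https://agents-mcp-hackathon-huggingfacedoc.hf.space"))
# → https://agents-mcp-hackathon-huggingfacedoc.hf.space/gradio_api/mcp/sse
```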
### Method 1: Test with Public Clients

You can test this server immediately using these public MCP Client Spaces. Just paste the server URL above into the client's URL input field.

**[➡️ Test with Hugging Face Client](https://huggingface.co/spaces/ABDALLALSWAITI/MCPclient)**

A special thank you to **Modal Labs** for providing credits and resources during the hackathon. You can also test with this client, deployed on **Modal**:

**[➡️ Test with Modal Client](https://abedalswaity7--huggingface-research-agent-ui.modal.run/)**

*For those interested, here is a guide to deploying Gradio applications on Modal: [How to run Gradio apps on Modal](https://modal.com/blog/how_to_run_gradio_on_modal_article).*
### Method 2: Integrate with UI Clients (e.g., Cursor IDE)

MCP hosts often use a configuration file, typically named `mcp.json`, to manage server connections. For a remote server using HTTP+SSE transport, the configuration points to the server's URL. Create an `mcp.json` file with the following structure:
```json
{
  "mcpServers": {
    "huggingface-doc": {
      "url": "https://agents-mcp-hackathon-huggingfacedoc.hf.space/gradio_api/mcp/sse"
    }
  }
}
```

*(The body of this block is truncated in the diff; the shape above is a typical SSE entry, with the illustrative name `huggingface-doc` and this server's endpoint as the `url`.)*
#### Configuring Cursor IDE

To connect this server in Cursor, you can use the `mcp-remote` tool, which acts as a bridge between Cursor's local stdio transport and the server's remote SSE endpoint.

1. Open Cursor settings (`Ctrl + Shift + J` / `Cmd + Shift + J`).
2. Go to the `MCP` tab and click `Add new global MCP server`.
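For step 2, a typical entry for this server looks like the following (assuming Node.js is available for `npx`; the entry name `huggingface-doc` is illustrative):

```json
{
  "mcpServers": {
    "huggingface-doc": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://agents-mcp-hackathon-huggingfacedoc.hf.space/gradio_api/mcp/sse"
      ]
    }
  }
}
```

Here `mcp-remote` launches locally and proxies Cursor's stdio messages to the remote SSE endpoint.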
**3. Create and Run the Client Script**

Save the client code as `client_app.py` and run it from your terminal:

```bash
python client_app.py
```
## Advanced: Using `LiteLLMModel` Directly

The `smolagents` library uses `LiteLLMModel` to interact with various language models. To use a model such as Anthropic's Claude 3.5 Sonnet directly, ensure your API key is set (`export ANTHROPIC_API_KEY="YOUR_KEY"`) and use the following pattern:
```python
from smolagents import LiteLLMModel

# Define the messages in the standard conversation format
messages = [
    {"role": "user", "content": [{"type": "text", "text": "Hello, how are you?"}]}
]

# Instantiate the model
model = LiteLLMModel(
    model_id="anthropic/claude-3-5-sonnet-latest",
    temperature=0.2,
    max_tokens=1024,
)

# Call the model with the messages
response = model(messages)
print(response)
```
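The nested `content` list in that format is easy to mistype; a small helper (hypothetical, not part of `smolagents`) keeps the structure in one place:

```python
def user_message(text: str) -> dict:
    # Build one user turn in the role/content format shown above.
    return {"role": "user", "content": [{"type": "text", "text": text}]}

# Equivalent to the hand-written `messages` list above:
messages = [user_message("Hello, how are you?")]
```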
## License

This project is licensed under the Apache 2.0 License.