mgbam committed
Commit 59df69a · verified · 1 Parent(s): 5ab43e9

Delete notebooks/demo_notebook.ipynb

Files changed (1)
  1. notebooks/demo_notebook.ipynb +0 -103
notebooks/demo_notebook.ipynb DELETED
@@ -1,103 +0,0 @@
- {
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# AnyCoder Demo Notebook\n",
- "\n",
- "This notebook illustrates how to:\n",
- "\n",
- "1. **Instantiate** the unified `hf_client` with automatic provider routing.\n",
- "2. **Call** a chat completion (Groq → OpenAI → Gemini fall‑back).\n",
- "3. **Trigger** the FastAPI `/predict` endpoint served by *app.py*.\n",
- "4. **Run** a quick sentiment‑analysis pipeline using your preferred provider."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# 1. Setup inference client\n",
- "from hf_client import get_inference_client\n",
- "\n",
- "# Choose a model (will route to best provider according to prefix rules)\n",
- "model_id = 'openai/gpt-4' # try 'gemini/pro' or any HF model path\n",
- "client = get_inference_client(model_id, provider='auto')\n",
- "\n",
- "# Simple chat completion\n",
- "resp = client.chat.completions.create(\n",
- " model=model_id,\n",
- " messages=[{'role': 'user', 'content': 'Write a Python function to reverse a string.'}]\n",
- ")\n",
- "print(resp.choices[0].message.content)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# 2. Sentiment analysis via HF Inference Providers (OpenAI GPT‑4)\n",
- "from transformers import pipeline\n",
- "\n",
- "sentiment = pipeline(\n",
- " 'sentiment-analysis', \n",
- " model='openai/gpt-4', # could be 'gemini/pro' etc.\n",
- " trust_remote_code=True\n",
- ")\n",
- "sentiment('I love building AI‑powered tools!')"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# 3. Call the FastAPI /predict endpoint exposed by app.py\n",
- "import json, requests\n",
- "\n",
- "payload = {\n",
- " 'prompt': 'Generate a minimal HTML page.',\n",
- " 'model_id': 'gemini/pro',\n",
- " 'language': 'html',\n",
- " 'web_search': False\n",
- "}\n",
- "\n",
- "r = requests.post('http://localhost:7860/predict', json=payload)\n",
- "print('Status:', r.status_code)\n",
- "print(json.loads(r.text)['code'][:400])"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "---\n",
- "## Next steps\n",
- "\n",
- "* Switch `model_id` to **`'gemini/pro'`**, **`'fireworks-ai/fireworks-v1'`**, or any HF model (e.g. `Qwen/Qwen3-32B`)—routing will adjust automatically.\n",
- "* Explore **`plugins.py`** for Slack / GitHub integrations.\n",
- "* Use **`auth.py`** helpers to pull private Google Drive docs into the pipeline.\n",
- "* Extend `/predict` with temperature, max‑tokens, or stream support."
- ]
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "name": "python",
- "version": "3.x"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 5
- }