{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Tools in LlamaIndex\n",
    "\n",
    "\n",
    "This notebook is part of the [Hugging Face Agents Course](https://www.hf.co/learn/agents-course), a free Course from beginner to expert, where you learn to build Agents.\n",
    "\n",
    "![Agents course share](https://huggingface.co/datasets/agents-course/course-images/resolve/main/en/communication/share.png)\n",
    "\n",
    "## Let's install the dependencies\n",
    "\n",
    "We will install the dependencies for this unit."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "!pip install llama-index datasets llama-index-callbacks-arize-phoenix llama-index-vector-stores-chroma llama-index-llms-huggingface-api -U -q"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "And, let's log in to Hugging Face to use serverless Inference APIs."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from huggingface_hub import login\n",
    "\n",
    "login()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Creating a FunctionTool\n",
    "\n",
    "Let's create a basic `FunctionTool` and call it."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [],
   "source": [
    "from llama_index.core.tools import FunctionTool\n",
    "\n",
    "\n",
    "def get_weather(location: str) -> str:\n",
    "    \"\"\"Useful for getting the weather for a given location.\"\"\"\n",
    "    print(f\"Getting weather for {location}\")\n",
    "    return f\"The weather in {location} is sunny\"\n",
    "\n",
    "\n",
    "tool = FunctionTool.from_defaults(\n",
    "    get_weather,\n",
    "    name=\"my_weather_tool\",\n",
    "    description=\"Useful for getting the weather for a given location.\",\n",
    ")\n",
    "tool.call(\"New York\")"
   ]
  },
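  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The call above returns a `ToolOutput` object rather than a plain string. As a small optional sketch (reusing the `get_weather` function from the previous cell), `FunctionTool.from_defaults` should also be able to infer the tool name and description from the function name and docstring when we don't pass them explicitly."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A minimal sketch: without an explicit name/description, the tool metadata\n",
    "# is expected to fall back to the function's name and docstring.\n",
    "default_tool = FunctionTool.from_defaults(get_weather)\n",
    "print(default_tool.metadata.name)\n",
    "print(default_tool.metadata.description)\n",
    "\n",
    "# `call` returns a ToolOutput; the text of the result lives on `.content`.\n",
    "output = default_tool.call(\"New York\")\n",
    "print(output.content)"
   ]
  },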
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Creating a QueryEngineTool\n",
    "\n",
    "Let's now re-use the `QueryEngine` we defined in the [previous unit on tools](/tools.ipynb) and convert it into a `QueryEngineTool`. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "ToolOutput(content=' As an anthropologist, I am intrigued by the potential implications of AI on the future of work and society. My research focuses on the cultural and social aspects of technological advancements, and I believe it is essential to understand how AI will shape the lives of Cypriot people and the broader society. I am particularly interested in exploring how AI will impact traditional industries, such as agriculture and tourism, and how it will affect the skills and knowledge required for future employment. As someone who has spent extensive time in Cyprus, I am well-positioned to investigate the unique cultural and historical context of the island and how it will influence the adoption and impact of AI. My research will not only provide valuable insights into the future of work but also contribute to the development of policies and strategies that support the well-being of Cypriot citizens and the broader society. \\n\\nAs an environmental historian or urban planner, I am more focused on the ecological and sustainability aspects of AI, particularly in the context of urban planning and conservation. I believe that AI has the potential to significantly impact the built environment and the natural world, and I am eager to explore how it can be used to create more sustainable and resilient cities. My research will focus on the intersection of AI, urban planning, and environmental conservation, and I', tool_name='some useful name', raw_input={'input': 'Responds about research on the impact of AI on the future of work and society?'}, raw_output=Response(response=' As an anthropologist, I am intrigued by the potential implications of AI on the future of work and society. My research focuses on the cultural and social aspects of technological advancements, and I believe it is essential to understand how AI will shape the lives of Cypriot people and the broader society. I am particularly interested in exploring how AI will impact traditional industries, such as agriculture and tourism, and how it will affect the skills and knowledge required for future employment. As someone who has spent extensive time in Cyprus, I am well-positioned to investigate the unique cultural and historical context of the island and how it will influence the adoption and impact of AI. My research will not only provide valuable insights into the future of work but also contribute to the development of policies and strategies that support the well-being of Cypriot citizens and the broader society. \\n\\nAs an environmental historian or urban planner, I am more focused on the ecological and sustainability aspects of AI, particularly in the context of urban planning and conservation. I believe that AI has the potential to significantly impact the built environment and the natural world, and I am eager to explore how it can be used to create more sustainable and resilient cities. 
My research will focus on the intersection of AI, urban planning, and environmental conservation, and I', source_nodes=[NodeWithScore(node=TextNode(id_='f0ea24d2-4ed3-4575-a41f-740a3fa8b521', embedding=None, metadata={'file_path': '/Users/davidberenstein/Documents/programming/huggingface/agents-course/notebooks/unit2/llama-index/data/persona_1.txt', 'file_name': 'persona_1.txt', 'file_type': 'text/plain', 'file_size': 266, 'creation_date': '2025-02-27', 'last_modified_date': '2025-02-27'}, excluded_embed_metadata_keys=['file_name', 'file_type', 'file_size', 'creation_date', 'last_modified_date', 'last_accessed_date'], excluded_llm_metadata_keys=['file_name', 'file_type', 'file_size', 'creation_date', 'last_modified_date', 'last_accessed_date'], relationships={<NodeRelationship.SOURCE: '1'>: RelatedNodeInfo(node_id='d5db5bf4-daac-41e5-b5aa-271e8305da25', node_type='4', metadata={'file_path': '/Users/davidberenstein/Documents/programming/huggingface/agents-course/notebooks/unit2/llama-index/data/persona_1.txt', 'file_name': 'persona_1.txt', 'file_type': 'text/plain', 'file_size': 266, 'creation_date': '2025-02-27', 'last_modified_date': '2025-02-27'}, hash='e6c87149a97bf9e5dbdf33922a4e5023c6b72550ca0b63472bd5d25103b28e99')}, metadata_template='{key}: {value}', metadata_separator='\\n', text='An anthropologist or a cultural expert interested in the intricacies of Cypriot culture, history, and society, particularly someone who has spent considerable time researching and living in Cyprus to gain a deep understanding of its people, customs, and way of life.', mimetype='text/plain', start_char_idx=0, end_char_idx=266, metadata_seperator='\\n', text_template='{metadata_str}\\n\\n{content}'), score=0.3761845613489774), NodeWithScore(node=TextNode(id_='cebcd676-3180-4cda-be99-d535babc1b96', embedding=None, metadata={'file_path': '/Users/davidberenstein/Documents/programming/huggingface/agents-course/notebooks/unit2/llama-index/data/persona_1004.txt', 'file_name': 'persona_1004.txt', 'file_type': 'text/plain', 'file_size': 160, 'creation_date': '2025-02-27', 'last_modified_date': '2025-02-27'}, excluded_embed_metadata_keys=['file_name', 'file_type', 'file_size', 'creation_date', 'last_modified_date', 'last_accessed_date'], excluded_llm_metadata_keys=['file_name', 'file_type', 'file_size', 'creation_date', 'last_modified_date', 'last_accessed_date'], relationships={<NodeRelationship.SOURCE: '1'>: RelatedNodeInfo(node_id='1347651d-7fc8-42d4-865c-a0151a534a1b', node_type='4', metadata={'file_path': '/Users/davidberenstein/Documents/programming/huggingface/agents-course/notebooks/unit2/llama-index/data/persona_1004.txt', 'file_name': 'persona_1004.txt', 'file_type': 'text/plain', 'file_size': 160, 'creation_date': '2025-02-27', 'last_modified_date': '2025-02-27'}, hash='19628b0ae4a0f0ebd63b75e13df7d9183f42e8bb84358fdc2c9049c016c4b67d')}, metadata_template='{key}: {value}', metadata_separator='\\n', text='An environmental historian or urban planner focused on ecological conservation and sustainability, likely working in local government or a related organization.', mimetype='text/plain', start_char_idx=0, end_char_idx=160, metadata_seperator='\\n', text_template='{metadata_str}\\n\\n{content}'), score=0.3733060058493167)], metadata={'f0ea24d2-4ed3-4575-a41f-740a3fa8b521': {'file_path': '/Users/davidberenstein/Documents/programming/huggingface/agents-course/notebooks/unit2/llama-index/data/persona_1.txt', 'file_name': 'persona_1.txt', 'file_type': 'text/plain', 'file_size': 266, 'creation_date': 
'2025-02-27', 'last_modified_date': '2025-02-27'}, 'cebcd676-3180-4cda-be99-d535babc1b96': {'file_path': '/Users/davidberenstein/Documents/programming/huggingface/agents-course/notebooks/unit2/llama-index/data/persona_1004.txt', 'file_name': 'persona_1004.txt', 'file_type': 'text/plain', 'file_size': 160, 'creation_date': '2025-02-27', 'last_modified_date': '2025-02-27'}}), is_error=False)"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "import chromadb\n",
    "\n",
    "from llama_index.core import VectorStoreIndex\n",
    "from llama_index.llms.huggingface_api import HuggingFaceInferenceAPI\n",
    "from llama_index.embeddings.huggingface_api import HuggingFaceInferenceAPIEmbedding\n",
    "from llama_index.core.tools import QueryEngineTool\n",
    "from llama_index.vector_stores.chroma import ChromaVectorStore\n",
    "\n",
    "db = chromadb.PersistentClient(path=\"./alfred_chroma_db\")\n",
    "chroma_collection = db.get_or_create_collection(\"alfred\")\n",
    "vector_store = ChromaVectorStore(chroma_collection=chroma_collection)\n",
    "embed_model = HuggingFaceInferenceAPIEmbedding(model_name=\"BAAI/bge-small-en-v1.5\")\n",
    "llm = HuggingFaceInferenceAPI(model_name=\"meta-llama/Llama-3.2-3B-Instruct\")\n",
    "index = VectorStoreIndex.from_vector_store(\n",
    "    vector_store=vector_store, embed_model=embed_model\n",
    ")\n",
    "query_engine = index.as_query_engine(llm=llm)\n",
    "tool = QueryEngineTool.from_defaults(\n",
    "    query_engine=query_engine,\n",
    "    name=\"some useful name\",\n",
    "    description=\"some useful description\",\n",
    ")\n",
    "await tool.acall(\n",
    "    \"Responds about research on the impact of AI on the future of work and society?\"\n",
    ")"
   ]
  },
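  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As with the `FunctionTool`, the result is a `ToolOutput`. A small sketch (assuming the `tool` from the previous cell): the synthesized answer is available on `.content`, while `.raw_output` holds the underlying `Response`, including the retrieved `source_nodes`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A small sketch reusing `tool` from the previous cell to inspect the ToolOutput.\n",
    "response = await tool.acall(\n",
    "    \"Responds about research on the impact of AI on the future of work and society?\"\n",
    ")\n",
    "print(response.content)  # the synthesized answer text\n",
    "print(len(response.raw_output.source_nodes))  # number of retrieved source nodes"
   ]
  },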
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Creating Toolspecs\n",
    "\n",
    "Let's create a `ToolSpec` from the `GmailToolSpec` from the LlamaHub and convert it to a list of tools. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[<llama_index.core.tools.function_tool.FunctionTool at 0x15d638410>,\n",
       " <llama_index.core.tools.function_tool.FunctionTool at 0x15d64a190>,\n",
       " <llama_index.core.tools.function_tool.FunctionTool at 0x159cf8050>,\n",
       " <llama_index.core.tools.function_tool.FunctionTool at 0x15d65b150>,\n",
       " <llama_index.core.tools.function_tool.FunctionTool at 0x15cef6990>,\n",
       " <llama_index.core.tools.function_tool.FunctionTool at 0x15d642b10>]"
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "from llama_index.tools.google import GmailToolSpec\n",
    "\n",
    "tool_spec = GmailToolSpec()\n",
    "tool_spec_list = tool_spec.to_tool_list()\n",
    "tool_spec_list"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To get a more detailed view of the tools, we can take a look at the `metadata` of each tool."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[('load_data',\n",
       "  \"load_data() -> List[llama_index.core.schema.Document]\\nLoad emails from the user's account.\"),\n",
       " ('search_messages',\n",
       "  \"search_messages(query: str, max_results: Optional[int] = None)\\nSearches email messages given a query string and the maximum number\\n        of results requested by the user\\n           Returns: List of relevant message objects up to the maximum number of results.\\n\\n        Args:\\n            query[str]: The user's query\\n            max_results (Optional[int]): The maximum number of search results\\n            to return.\\n        \"),\n",
       " ('create_draft',\n",
       "  \"create_draft(to: Optional[List[str]] = None, subject: Optional[str] = None, message: Optional[str] = None) -> str\\nCreate and insert a draft email.\\n           Print the returned draft's message and id.\\n           Returns: Draft object, including draft id and message meta data.\\n\\n        Args:\\n            to (Optional[str]): The email addresses to send the message to\\n            subject (Optional[str]): The subject for the event\\n            message (Optional[str]): The message for the event\\n        \"),\n",
       " ('update_draft',\n",
       "  \"update_draft(to: Optional[List[str]] = None, subject: Optional[str] = None, message: Optional[str] = None, draft_id: str = None) -> str\\nUpdate a draft email.\\n           Print the returned draft's message and id.\\n           This function is required to be passed a draft_id that is obtained when creating messages\\n           Returns: Draft object, including draft id and message meta data.\\n\\n        Args:\\n            to (Optional[str]): The email addresses to send the message to\\n            subject (Optional[str]): The subject for the event\\n            message (Optional[str]): The message for the event\\n            draft_id (str): the id of the draft to be updated\\n        \"),\n",
       " ('get_draft',\n",
       "  \"get_draft(draft_id: str = None) -> str\\nGet a draft email.\\n           Print the returned draft's message and id.\\n           Returns: Draft object, including draft id and message meta data.\\n\\n        Args:\\n            draft_id (str): the id of the draft to be updated\\n        \"),\n",
       " ('send_draft',\n",
       "  \"send_draft(draft_id: str = None) -> str\\nSends a draft email.\\n           Print the returned draft's message and id.\\n           Returns: Draft object, including draft id and message meta data.\\n\\n        Args:\\n            draft_id (str): the id of the draft to be updated\\n        \")]"
      ]
     },
     "execution_count": 15,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "[(tool.metadata.name, tool.metadata.description) for tool in tool_spec_list]"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": ".venv",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}