Benjamin Consolvo committed on
Commit d1ff7ba · 1 Parent(s): 248d80d

readme updates

Files changed (1)
  1. README.md +26 -72
README.md CHANGED
@@ -4,22 +4,21 @@ emoji: 🐨
  colorFrom: yellow
  colorTo: purple
  sdk: streamlit
- sdk_version: 1.44.1
  app_file: app.py
  pinned: false
  license: mit
  short_description: Let AI agents plan your next vacation!
  ---

- # 🏖️ VacAIgent: Streamlit-Integrated AI Crew for Trip Planning

- VacAIgent leverages the CrewAI framework to automate and enhance the trip planning experience, integrating a user-friendly Streamlit interface. This project demonstrates how autonomous AI agents can collaborate and execute complex tasks efficiently.

- _Forked and enhanced from the_ [_crewAI examples repository_](https://github.com/joaomdmoura/crewAI-examples/tree/main/trip_planner). You can find the application hosted on Hugging Face Spaces here:

  [![](images/hf_vacaigent.png)](https://huggingface.co/spaces/Intel/vacaigent)

-
  **Check out the video below for code walkthrough** 👇

  <a href="https://youtu.be/nKG_kbQUDDE">
@@ -28,110 +27,65 @@ _Forked and enhanced from the_ [_crewAI examples repository_](https://github.com

  (_Trip example originally developed by [@joaomdmoura](https://x.com/joaomdmoura)_)

- ## CrewAI Framework
-
- CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a Streamlit user interface.
-

- ## Running the Application

- To experience the VacAIgent app:

  ### Pre-Requisites
- 1. Get the API key from **scrapinagent.com** from [scrapinagent](https://scrapingant.com/)
- 2. Get the API from **SERPER API** from [serper]( https://serper.dev/)
  3. Bring your OpenAI compatible API key
- 4. Bring your model endpoint URL and LLM model ID that you want to use

- ### Deploy Trip Planner

- #### Step 1
- Clone the repository:
  ```sh
  git clone https://github.com/opea-project/Enterprise-Inference/
  cd examples/vacaigent
  ```
-
- #### Step 2
-
- Insall Dependencies
  ```sh
  pip install -r requirements.txt
  ```
- #### Step 3
  Add Streamlit secrets. Create a `.streamlit/secrets.toml` file and update the variables below:

  ```sh
- SERPER_API_KEY=""
- SCRAPINGANT_API_KEY=""
- OPENAI_API_KEY=""
  MODEL_ID="meta-llama/Llama-3.3-70B-Instruct"
  MODEL_BASE_URL="https://api.inference.denvrdata.com/v1/"
-
  ```
- **Note**: You can alternatively add these secrets directly to Hugging Face Spaces Secrets, under the Settings tab, if deploying the Streamlit application directly on Hugging Face.

- #### Step 4

- Run the application

  ```sh
  streamlit run app.py
  ```
- Your application should be up and running in your web browser.
-
- ★ **Disclaimer**: The application uses meta-llama/Llama-3.3-70B-Instruct by default. Ensure you have access to an OpenAI-compatible API and be aware of any associated costs.
-
- ## Details & Explanation
-
- - **Components**:
  - [trip_tasks.py](trip_tasks.py): Contains task prompts for the agents.
  - [trip_agents.py](trip_agents.py): Manages the creation of agents.
  - [tools](tools) directory: Houses tool classes used by agents.
  - [app.py](app.py): The heart of the frontend Streamlit app.

- ## LLM Model
-
- To switch the LLM model being used, you can switch the `MODEL_ID` in the `.streamlit/secrets.toml` file.
-
  ## Using Local Models with Ollama

- For enhanced privacy and customization, you can integrate local models like Ollama:
-
- ### Setting Up Ollama
-
- - **Installation**: Follow Ollama's guide for installation.
- - **Configuration**: Customize the model as per your requirements.
-
- ### Integrating Ollama with CrewAI
-
- Pass the Ollama model to agents in the CrewAI framework:
-
- ```python
- from langchain.llms import Ollama
-
- ollama_model = Ollama(model="agent")
-
- class TripAgents:
-     # ... existing methods
-
-     def local_expert(self):
-         return Agent(
-             role='Local Expert',
-             tools=[SearchTools.search_internet, BrowserTools.scrape_and_summarize_website],
-             llm=ollama_model,
-             verbose=True
-         )
- ```

- ## Benefits of Local Models

- - **Privacy**: Process sensitive data in-house.
- - **Customization**: Tailor models to fit specific needs.
- - **Performance**: Potentially faster responses with on-premises models.
-
- ## License

- VacAIgent is open-sourced under the MIT license.
  colorFrom: yellow
  colorTo: purple
  sdk: streamlit
+ sdk_version: 1.45.1
  app_file: app.py
  pinned: false
  license: mit
  short_description: Let AI agents plan your next vacation!
  ---

+ # 🏖️ VacAIgent: Let AI agents plan your next vacation!

+ VacAIgent leverages the CrewAI agentic framework to automate and enhance the trip planning experience, integrating a user-friendly Streamlit interface. This project demonstrates how autonomous AI agents can collaborate and execute complex tasks efficiently. It takes advantage of [Intel® AI for Enterprise Inference](https://github.com/opea-project/Enterprise-Inference) as the inference endpoint, accessed with an OpenAI-compatible API key and hosted on the cloud provider [Denvr Dataworks](https://www.denvrdata.com/intel).

+ _Forked and enhanced from the_ [_crewAI examples repository_](https://github.com/joaomdmoura/crewAI-examples/tree/main/trip_planner). You can find the application hosted on Hugging Face Spaces [here](https://huggingface.co/spaces/Intel/vacaigent):

  [![](images/hf_vacaigent.png)](https://huggingface.co/spaces/Intel/vacaigent)

  **Check out the video below for a code walkthrough** 👇

  <a href="https://youtu.be/nKG_kbQUDDE">

  (_Trip example originally developed by [@joaomdmoura](https://x.com/joaomdmoura)_)

+ ## Installing and Using the Application

  ### Pre-Requisites
+ 1. Get an API key from **ScrapingAnt** ([scrapingant.com](https://scrapingant.com/)) for HTML web scraping.
+ 2. Get an API key from **Serper** ([serper.dev](https://serper.dev/)) for the Google Search API.
  3. Bring your OpenAI-compatible API key.
+ 4. Bring the model endpoint URL and LLM model ID you want to use.

+ ### Installation steps

+ First, clone the repository:
  ```sh
  git clone https://github.com/opea-project/Enterprise-Inference/
  cd Enterprise-Inference/examples/vacaigent
  ```
+ Then, install the necessary libraries:
  ```sh
  pip install -r requirements.txt
  ```
  Add Streamlit secrets. Create a `.streamlit/secrets.toml` file and update the variables below:

  ```toml
+ SERPER_API_KEY="serper-api-key"
+ SCRAPINGANT_API_KEY="scrapingant-api-key"
+ OPENAI_API_KEY="openai-api-key"
  MODEL_ID="meta-llama/Llama-3.3-70B-Instruct"
  MODEL_BASE_URL="https://api.inference.denvrdata.com/v1/"
  ```

+ By default, the app uses the model [meta-llama/Llama-3.3-70B-Instruct](https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct) with a model endpoint from Denvr Dataworks, but you can bring your own OpenAI-compatible API key, model ID, and model endpoint.
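If you want to confirm that the endpoint, key, and model ID work before launching the app, a minimal sketch along these lines can help. It is illustrative only and not part of the repository; it assumes the `openai` Python package is installed and Python 3.11+ for `tomllib`, and the file name and prompt are made up:

```python
# check_endpoint.py: hypothetical helper, not part of this repo.
# Reads .streamlit/secrets.toml and sends one test request to the
# OpenAI-compatible endpoint to confirm the key, URL, and model ID work.
import tomllib  # Python 3.11+; use the `toml` package on older versions

from openai import OpenAI

with open(".streamlit/secrets.toml", "rb") as f:
    secrets = tomllib.load(f)

client = OpenAI(
    base_url=secrets["MODEL_BASE_URL"],  # e.g. https://api.inference.denvrdata.com/v1/
    api_key=secrets["OPENAI_API_KEY"],
)

response = client.chat.completions.create(
    model=secrets["MODEL_ID"],  # e.g. meta-llama/Llama-3.3-70B-Instruct
    messages=[{"role": "user", "content": "Suggest one destination for a long weekend."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```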

+ **Note**: You can alternatively add these secrets directly to Hugging Face Spaces Secrets, under the Settings tab, if deploying the Streamlit application directly on Hugging Face.

+ ### Run the application
+ To run the application locally, execute this command to open the Streamlit interface in your web browser:
  ```sh
  streamlit run app.py
  ```

+ ### Components
  - [trip_tasks.py](trip_tasks.py): Contains task prompts for the agents.
  - [trip_agents.py](trip_agents.py): Manages the creation of agents.
  - [tools](tools) directory: Houses tool classes used by agents.
  - [app.py](app.py): The heart of the frontend Streamlit app, which wires the agents, tasks, and tools together (see the sketch below).
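For orientation, these components follow the usual CrewAI pattern of agents, tasks, and a crew. The sketch below is illustrative only: the names are made up, the repository's real definitions live in the files listed above, and it assumes an LLM is already configured (for example, via the secrets described earlier):

```python
# Illustrative CrewAI pattern only; the real agents, tasks, and tools live in
# trip_agents.py, trip_tasks.py, and tools/. Names here are hypothetical.
from crewai import Agent, Task, Crew

planner = Agent(
    role="Travel Planner",
    goal="Draft a day-by-day itinerary for the selected city",
    backstory="An experienced planner who balances sights, food, and budget.",
    verbose=True,
)

itinerary_task = Task(
    description="Plan a 5-day trip to Lisbon for a traveler who loves food and museums.",
    expected_output="A day-by-day itinerary with restaurants, sights, and rough costs.",
    agent=planner,
)

crew = Crew(agents=[planner], tasks=[itinerary_task], verbose=True)
result = crew.kickoff()  # runs the agents and returns the final itinerary
print(result)
```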

  ## Using Local Models with Ollama

+ For enhanced privacy and customization, you could easily substitute cloud-hosted models with locally-hosted models from [Ollama](https://ollama.com/).
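As a rough sketch of what that substitution could look like (illustrative only; the exact wiring depends on your CrewAI version, and it assumes Ollama is running locally with a model already pulled):

```python
# Illustrative only; adapt to the agents defined in trip_agents.py.
# Assumes `ollama serve` is running and the model was pulled first,
# e.g. `ollama pull llama3.1`.
from crewai import Agent, LLM

local_llm = LLM(
    model="ollama/llama3.1",            # LiteLLM-style provider/model string
    base_url="http://localhost:11434",  # default local Ollama endpoint
)

local_expert = Agent(
    role="Local Expert",
    goal="Recommend neighborhoods, food, and sights in the chosen city",
    backstory="A long-time resident who knows the city inside out.",
    llm=local_llm,
    verbose=True,
)
```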
 
+ ## License

+ VacAIgent is open-sourced under the MIT license.

+ ### Follow Up

+ Connect to LLMs on Intel® Gaudi® accelerators with just an endpoint and an OpenAI-compatible API key, courtesy of cloud provider Denvr Dataworks: https://www.denvrdata.com/intel

+ Chat with 6K+ fellow developers on the Intel DevHub Discord: https://discord.gg/kfJ3NKEw5t

+ Connect with me on LinkedIn: https://linkedin.com/in/bconsolvo