Sam committed · Commit 13c75a7 · Parent: 4726801
Updated README

README.md CHANGED
@@ -1,152 +1,52 @@
Removed content (old `README.md`):



Create a `Protected` endpoint.



If you were successful, you should see the following screen:



#### Embedding Model Endpoint

We'll be using `Snowflake/snowflake-arctic-embed-m` for our embedding model today.

The process is the same as the LLM - but we'll make a few specific tweaks:

Let's make sure our set-up reflects the following screenshots:



After which, make sure the advanced configuration is set like so:



> #### NOTE: PLEASE SHUT DOWN YOUR INSTANCES WHEN YOU HAVE COMPLETED THE ASSIGNMENT TO PREVENT UNNECESSARY CHARGES.
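What the embedding endpoint enables downstream is similarity-based retrieval: documents and queries become vectors, and documents are ranked by how close their vectors are to the query's. A minimal sketch of that ranking step in plain Python, using toy vectors rather than real `snowflake-arctic-embed-m` outputs:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for endpoint outputs.
query = [0.9, 0.1, 0.0]
docs = {
    "doc_a": [0.8, 0.2, 0.1],
    "doc_b": [0.0, 0.9, 0.4],
}

# Rank documents by similarity to the query, best first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
print(ranked)  # doc_a ranks first: its vector points nearly the same way as the query's
```

In the real pipeline, the vectors come from the embedding endpoint and the ranking is handled by the vector store, but the scoring is the same idea.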
### Build Task 2: Create RAG Pipeline with LangChain

Follow the [notebook](https://colab.research.google.com/drive/1v1FYmvKH4gsqcdZwIT9wvbQe0GUjrc9d?usp=sharing) to create a LangChain pipeline powered by Hugging Face endpoints!

Once you're done - please move on to Build Task 3!
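Conceptually, the chain the notebook builds is retrieve → format prompt → generate. A library-free sketch of that flow, with a stubbed model call standing in for the Hugging Face LLM endpoint (all names here are illustrative, not the notebook's):

```python
def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Toy retriever: rank passages by word overlap with the question."""
    words = set(question.lower().split())
    scored = sorted(corpus, key=lambda p: len(words & set(p.lower().split())), reverse=True)
    return scored[:k]

def format_prompt(question: str, context: list[str]) -> str:
    """Inject retrieved context into the prompt, as the LCEL chain does."""
    joined = "\n".join(context)
    return f"Use the context to answer.\n\nContext:\n{joined}\n\nQuestion: {question}"

def generate(prompt: str) -> str:
    """Stub standing in for the call to the Hugging Face LLM endpoint."""
    last_line = prompt.splitlines()[-1]
    return f"[stub answer to {last_line!r}]"

corpus = ["Paul Graham wrote essays.", "Chainlit builds chat UIs.", "Qdrant stores vectors."]
answer = generate(format_prompt("Who wrote essays?", retrieve("Who wrote essays?", corpus)))
print(answer)
```

In the notebook, LCEL composes these same three stages with `|`, and the retriever and LLM are backed by your deployed endpoints instead of stubs.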
### Build Task 3: Create a Chainlit Application

1. Create a new empty Docker space through Hugging Face - with the following settings:



> NOTE: You may notice the application builds slowly (~15min.) with the default free-tier hardware. The process will be faster using the `CPU upgrade` Space Hardware - though it is not required.

2. Clone the newly created space into a directory that is *NOT IN YOUR AI MAKERSPACE REPOSITORY* using the SSH option.

> NOTE: You may need to ensure you've added your SSH key to Hugging Face, as well as GitHub. This should already be done.



3. Copy and paste (`cp ...` or through the UI) the contents of `Week 4/Day 1` into the newly cloned repository.

> NOTE: Please keep the `README.md` that was cloned from your space and delete the class `README.md`.

4. Using the `ls` command or the `tree` command, verify that you have copied over:
- `app.py`
- `Dockerfile`
- `data/paul_graham_essays.txt`
- `chainlit.md`
- `.gitignore`
- `.env.sample`
- `solution_app.py`
- `requirements.txt`

Here is an example using the `ls -al` CLI command:



5. Work through the `app.py` file to migrate your LCEL LangChain RAG Chain from the Notebook to Chainlit!

6. Be sure to modify your `README.md` and `chainlit.md` as you see fit!

> NOTE: If you get stuck, there is a working reference version in `solution_app.py`.

7. When you are done with local testing, push your changes to your space.

8. Make sure you add your `HF_LLM_ENDPOINT`, `HF_EMBED_ENDPOINT`, and `HF_TOKEN` as "Secrets" in your Hugging Face Space.
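Inside `app.py`, Space Secrets surface as ordinary environment variables. A minimal sketch of reading them defensively (the variable names come from the step above; the fail-loudly helper is an illustrative choice, not necessarily how `app.py` does it):

```python
import os
from typing import Optional

def get_required_env(name: str, default: Optional[str] = None) -> str:
    """Read a secret/environment variable, failing loudly if it is absent."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# The three secrets configured in the Space settings would be read like:
# hf_llm_endpoint = get_required_env("HF_LLM_ENDPOINT")
# hf_embed_endpoint = get_required_env("HF_EMBED_ENDPOINT")
# hf_token = get_required_env("HF_TOKEN")
```

Failing at startup with a clear message is much easier to debug from Space logs than a cryptic authentication error later in the request path.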
### Terminating Your Resources

Please head to the settings of each endpoint and select `Delete Endpoint`. You will need to type the name of the endpoint to delete the resources.

### Deliverables

- Completed Notebook
- Chainlit Application in a Hugging Face Space Powered by Hugging Face Endpoints
- Screenshot of endpoint usage

Example Screenshot:



## Ship 🚢

Create a Hugging Face Space powered by Hugging Face Endpoints!

### Deliverables

- A short Loom of the space, and a 1min. walkthrough of the application in full

## Share 🚀

Make a social media post about your final application!

### Deliverables

- Make a post on any social media platform about what you built!

Here's a template to get you started:

```
🚀 Exciting News! 🚀

I am thrilled to announce that I have just built and shipped an open-source LLM-powered Retrieval Augmented Generation Application with LangChain! 🎉🤖

🔍 Three Key Takeaways:
1️⃣
2️⃣
3️⃣

Let's continue pushing the boundaries of what's possible in the world of AI and question-answering. Here's to many more innovations! 🚀
Shout out to @AIMakerspace !

#LangChain #QuestionAnswering #RetrievalAugmented #Innovation #AI #TechMilestone

Feel free to reach out if you're curious or would like to collaborate on similar projects! 🤝🔥
```

> #### NOTE: PLEASE SHUT DOWN YOUR INSTANCES WHEN YOU HAVE COMPLETED THE ASSIGNMENT TO PREVENT UNNECESSARY CHARGES.
Added content (new `README.md`):

---
title: Midterm App
emoji: 🏠
colorFrom: blue
colorTo: purple
sdk: docker
app_file: app.py
pinned: false
---

# Midterm App

This is the Midterm App, a project developed for the AI Engineering course. The application leverages Chainlit, LangChain, OpenAI, and Qdrant to perform retrieval-augmented generation (RAG) on a PDF document.
14 |
|
15 |
+
## Features
|
16 |
|
17 |
+
- **Document Loading**: Loads and splits a PDF document into manageable chunks.
|
18 |
+
- **Embeddings and Retrieval**: Uses OpenAI embeddings and Qdrant for efficient document retrieval.
|
19 |
+
- **Question Answering**: Answers questions based on the context retrieved from the document.
|
20 |
+
- **Chainlit Integration**: Provides a chat interface for users to interact with the application.
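The loading step above splits the PDF's text into overlapping chunks before embedding, so that context isn't cut mid-thought at chunk boundaries. A simplified character-based splitter illustrating the idea (the sizes are arbitrary; the app's actual splitter and settings may differ):

```python
def split_text(text: str, chunk_size: int = 100, overlap: int = 20) -> list[str]:
    """Split text into fixed-size chunks whose ends overlap with the next chunk's start."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far each chunk's start advances
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

# 250 characters with chunk_size=100 and overlap=20 -> starts at 0, 80, 160, 240.
chunks = split_text("x" * 250, chunk_size=100, overlap=20)
print(len(chunks))  # 4
```

Each chunk is then embedded and stored in Qdrant; at question time only the most similar chunks are passed to the model as context.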
## Setup

To set up and run the application locally, follow these steps:

1. **Clone the repository**:

   ```shell
   git clone https://huggingface.co/spaces/sampazar/midterm-app
   cd midterm-app
   ```

2. **Build and run the Docker container**:

   ```shell
   docker build -t midterm-app .
   docker run -p 7860:7860 midterm-app
   ```
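The build step above is driven by the repository's `Dockerfile`. For orientation, a typical `Dockerfile` for a Chainlit app served on port 7860 looks roughly like the following sketch (illustrative only, not the file shipped in this repo):

```dockerfile
FROM python:3.9-slim

WORKDIR /app

# Install Python dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code.
COPY . .

# Hugging Face Spaces route traffic to port 7860 by default.
EXPOSE 7860

CMD ["chainlit", "run", "app.py", "--host", "0.0.0.0", "--port", "7860"]
```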
## Requirements

Make sure you have the following dependencies installed if you are not using Docker:

- Python 3.9
- pip

## Dependencies

The application depends on several Python packages, which are listed in the `requirements.txt` file. You can install them using:

```shell
pip install -r requirements.txt
```

## Usage

Run the application with:

```shell
chainlit run app.py --port 7860
```

Once the application is running, you can access it in your browser at http://localhost:7860.
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License.