---
title: Llm Bill Chat
emoji: 🥸🧮
colorFrom: indigo
colorTo: red
sdk: streamlit
sdk_version: 1.41.1
app_file: One_model.py
pinned: false
license: apache-2.0
short_description: 'LLM app'
---

# LLM Bill Chat App

This project is a proof of concept for a chat application that uses a Large Language Model (LLM) to assist users with telecom billing inquiries. The application is built with Python and Streamlit, which provides an interactive web interface.

## Features

- Maintain the chat conversation context (history); a minimal sketch of this pattern appears at the end of this README
- Allow users to query their billing information
- Compare recent bills and provide insights
- Respond exclusively with the user's own billing information
- Augment the prompt instructions with named entities (NER) recognized in the user's text
- Save user information and conversation history

## Project Structure

```
llm-bill-chat-app
├── src
│   ├── chat
│   │   ├── __init__.py
│   │   ├── context.py
│   │   ├── bill_comparison.py
│   │   ├── user_info.py
│   │   └── conversation.py
│   └── utils
│       └── __init__.py
├── bill.py            # Streamlit app entry point
├── requirements.txt
└── README.md
```

## Installation

1. Clone the repository:
   ```
   git clone https://github.com/serbantica/llm-bill-chat.git
   cd llm-bill-chat
   ```

2. Create and activate a virtual environment (Windows example):
   ```
   python -m venv .venv
   .venv\Scripts\activate
   ```

3. Install the required dependencies:
   ```
   pip install -r requirements.txt
   ```

## Usage

To run the application, execute the following command:

```
streamlit run bill.py
```

Open your web browser and navigate to `http://localhost:8501` to interact with the chat application.

## Contributing

Contributions are welcome! Please feel free to submit a pull request or open an issue with suggestions or improvements.

## License

This project is licensed under the Apache 2.0 License. See the LICENSE file for more details.
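
## Appendix: conversation-context sketch

The actual context handling lives in `src/chat/context.py` and `src/chat/conversation.py`, whose APIs are not shown here. The snippet below is a minimal, hypothetical sketch of the general pattern: a Streamlit chat loop that keeps the history in `st.session_state` and sends it back to the model on every turn. The OpenAI client and the model name `gpt-4o-mini` are illustrative assumptions, not necessarily what this project uses.

```python
import streamlit as st
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

st.title("Bill Chat (sketch)")

# Keep the running conversation in session state so it survives
# Streamlit's script reruns between user interactions.
if "messages" not in st.session_state:
    st.session_state.messages = [
        {
            "role": "system",
            "content": "Answer only from the user's own billing data included in the prompt.",
        }
    ]

# Replay the conversation so far (skip the system prompt).
for msg in st.session_state.messages[1:]:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask about your bill"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Send the full history so the model keeps the conversation context.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=st.session_state.messages,
    )
    answer = response.choices[0].message.content
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.markdown(answer)
```

In the real app, the system prompt would also be augmented with the user's billing records and the entities recognized in their question (NER) before each call, so the model can only answer from that user's own data.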