---
title: Interactive Chat Application
emoji: 💬
colorFrom: indigo
colorTo: indigo
sdk: gradio
sdk_version: 4.26.0
app_file: app.py
pinned: false
---
# LM Studio Gradio Chat
This repository contains a web-based chat application that integrates with various AI models from LM Studio, such as Mistral, OpenAI, and Llama, via a user-friendly Gradio interface. It is designed to maintain conversation history, providing a coherent and continuous AI chat experience comparable to systems like ChatGPT or Claude.
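The repository's own application code lives in `app.py` / `main.py`; the snippet below is only a minimal sketch of the core chat loop, assuming LM Studio's OpenAI-compatible local server at its default address (`http://localhost:1234/v1`) and the `openai` Python package. The model name and system prompt are placeholders to adjust to your setup.

```python
# Minimal sketch: a Gradio chat UI backed by LM Studio's OpenAI-compatible server.
# Assumes LM Studio is serving on its default local address and that the
# `openai` package is installed; the model name below is a placeholder.
import gradio as gr
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # local server; the key is ignored

def respond(message, history):
    # Rebuild the full conversation each turn so the model sees prior context.
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    for user_msg, assistant_msg in history:  # Gradio 4.x history: list of [user, assistant] pairs
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": message})

    reply = client.chat.completions.create(
        model="mistral-7b-instruct",  # placeholder: use whichever model is loaded in LM Studio
        messages=messages,
    )
    return reply.choices[0].message.content

demo = gr.ChatInterface(respond, title="LM Studio Gradio Chat")

if __name__ == "__main__":
    demo.launch()
```

Resending the accumulated history on every turn is what gives the model the conversational context described above.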
## Features
- Multiple AI Model Integration: Integrates with a variety of conversational AI models hosted on LM Studio, including Mistral, OpenAI, Llama, and more, offering flexibility in choosing the AI technology.
- Real-time Interaction: Engage with different AI models in real-time through a Gradio interface for a dynamic chat experience.
- Contextual Conversations: Maintains a conversation history to provide context to the AI models, enabling more coherent and meaningful interactions, similar to advanced systems like ChatGPT or Claude.
- User-Friendly Interface: Simple and intuitive UI built with Gradio, making it accessible for users with any level of technical expertise.
## Installation
To get started with this project, you'll need to set up your Python environment and install the necessary dependencies.
### Prerequisites
- Python 3.8 or higher
- Access to LM Studio with its local server running and a model loaded (for example, Mistral)
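Before launching the app, it can help to confirm that LM Studio's local server is reachable. The check below uses only the Python standard library and assumes LM Studio's default address (`http://localhost:1234`); adjust the URL if your server uses a different host or port.

```python
# Quick sanity check: list the models LM Studio is currently serving.
# Assumes the default local server address; change the URL if yours differs.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:1234/v1/models") as response:
    models = json.load(response)

for model in models.get("data", []):
    print(model["id"])
```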
### Setup
1. Clone this repository:

   ```bash
   git clone https://huggingface.co/your-username/your-repo-name
   cd your-repo-name
   ```

2. Install the required Python libraries:

   ```bash
   pip install -r requirements.txt
   ```

3. Launch the application:

   ```bash
   python main.py
   ```
## Usage
Once you have completed the installation, start the application by running:

```bash
python main.py
```
This will launch the Gradio interface in your default web browser, where you can interact with the selected model directly.
## Contributing
Contributions to this project are welcome! Please feel free to fork the repository, make your changes, and submit a pull request.
## License
This project is licensed under the MIT License - see the LICENSE file for details.