---
title: Materials AI App
emoji: π
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.23.1
app_file: app.py
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# Materials AI App
Materials AI App is a production-grade, full-stack web application for advanced text mining and data extraction in the materials science and battery research domains. It leverages domain-specific models (MatSciBERT, MaterialBERT, BatteryBERT) along with OpenAI's GPT-4 for natural language summarization and Q&A. The application uses FastAPI for the backend API and Gradio for an interactive demo interface, and it is deployed on Hugging Face Spaces.
## Features
- **Domain-Specific Text Mining:** Extract entities and relations (e.g., material properties, synthesis methods) from scientific literature using specialized BERT models (see the extraction sketch after this list).
- **Interactive Q&A:** Ask questions about materials science and battery research and receive context-aware answers.
- **Summarization:** Generate plain-language summaries of research text with GPT-4.
- **User-Friendly Interface:** A Gradio-based demo allows quick testing and visualization of extracted information.
- **Scalable & Secure:** Built with FastAPI, containerized with Docker, and wired into CI/CD for robust production deployment on Hugging Face Spaces.
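As a rough illustration of the text-mining feature, the sketch below wraps a domain BERT checkpoint in a Hugging Face token-classification pipeline. The model ID, and the assumption that a fine-tuned NER head is available, are illustrative; the app's actual loading logic lives in `app/models.py`.

```python
# Minimal sketch of domain-specific entity extraction (not the app's exact code).
# "m3rg-iitd/matscibert" is an assumed checkpoint; a fine-tuned NER head is
# required for meaningful labels.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

MODEL_ID = "m3rg-iitd/matscibert"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

text = "LiFePO4 cathodes were synthesized via a sol-gel route and annealed at 700 C."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```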
## Repository Structure
```
/materials-ai-app
├── app
│   ├── __init__.py
│   ├── main.py                  # FastAPI backend server
│   ├── models.py                # Domain model loading and inference functions
│   ├── openai_integration.py    # OpenAI helper functions (e.g., summarization)
│   └── utils.py                 # Utility functions (e.g., text/PDF parsing)
├── gradio_app.py                # Gradio demo interface for interactive Q&A and extraction
├── Dockerfile                   # Docker configuration for containerization
├── requirements.txt             # List of Python dependencies
└── README.md                    # Project documentation (this file)
```
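The `openai_integration.py` helpers are not reproduced here; as a rough idea, a GPT-4 summarization call with the current `openai` Python SDK (v1+) looks something like the sketch below. The function name and prompt are illustrative assumptions.

```python
# Illustrative sketch of a summarization helper (assumed to resemble
# app/openai_integration.py; function name and prompt are hypothetical).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize(text: str) -> str:
    """Return a plain-language summary of a materials-science passage."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "Summarize materials-science text in plain language."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content
```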
## Installation
1. **Clone the repository:**

   ```bash
   git clone https://huggingface.co/spaces/mgbam/materials-ai-app
   cd materials-ai-app
   ```

2. **Set up a virtual environment:**

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows use: venv\Scripts\activate
   ```

3. **Install dependencies:**

   ```bash
   pip install -r requirements.txt
   ```

4. **Configure environment variables** (e.g., in a `.env` file or your shell); a loading sketch follows this list:

   - `OPENAI_API_KEY`: your OpenAI API key.
   - `API_URL`: URL of your FastAPI backend (if different from localhost).
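A minimal sketch of reading these variables at startup, assuming `python-dotenv` is installed (the app's actual configuration code may differ):

```python
# Sketch of environment-based configuration (assumes python-dotenv is available).
import os

from dotenv import load_dotenv

load_dotenv()  # reads a local .env file if one exists

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
API_URL = os.getenv("API_URL", "http://localhost:8000")  # default to the local backend

if not OPENAI_API_KEY:
    raise RuntimeError("OPENAI_API_KEY is not set")
```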
## Running Locally

### Backend API

Start the FastAPI server:

```bash
uvicorn app.main:app --reload
```

Access the API at http://localhost:8000.
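The actual routes live in `app/main.py`; the sketch below only illustrates the general shape of a FastAPI extraction endpoint. The `/extract` path and request schema are assumptions, not the app's confirmed API.

```python
# Hypothetical shape of app/main.py (route name and schema are illustrative).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Materials AI App")

class ExtractionRequest(BaseModel):
    text: str

@app.post("/extract")
def extract(req: ExtractionRequest) -> dict:
    # The real app would delegate to the domain models in app/models.py.
    entities: list[dict] = []  # placeholder result
    return {"entities": entities}
```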
### Gradio Demo

In another terminal, run:

```bash
python gradio_app.py
```

This launches the interactive Gradio interface for testing extraction and summarization.
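Internally the demo forwards user input to the backend API. A hedged sketch of that pattern (the `/extract` endpoint and payload mirror the FastAPI sketch above and are assumptions, not the repository's confirmed interface):

```python
# Illustrative Gradio front end; gradio_app.py in this repo may differ.
import os

import gradio as gr
import requests

API_URL = os.getenv("API_URL", "http://localhost:8000")

def run_extraction(text: str) -> dict:
    # Forward the text to the (assumed) /extract endpoint of the backend.
    response = requests.post(f"{API_URL}/extract", json={"text": text}, timeout=60)
    response.raise_for_status()
    return response.json()

demo = gr.Interface(
    fn=run_extraction,
    inputs=gr.Textbox(lines=8, label="Research text"),
    outputs=gr.JSON(label="Extracted entities"),
)

if __name__ == "__main__":
    demo.launch()
```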
## Deployment

This repository is configured for deployment on Hugging Face Spaces. Push your changes to the repository, and Hugging Face Spaces will build and deploy the app automatically.
Alternatively, you can use the provided Dockerfile:
```bash
docker build -t materials-ai-app .
docker run -p 8000:8000 materials-ai-app
```

## Contributing

Contributions, issues, and feature requests are welcome! Please check the issues page for more details.
## References

- MatSciBERT on Hugging Face
- Hugging Face Spaces Documentation
- OpenAI API Documentation
## License

This project is licensed under the MIT License.
## Contact

For questions or feedback, please reach out to [[email protected]].