---
title: Materials AI App
emoji: π
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.23.1
app_file: app.py
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# Materials AI App
Materials AI App is a production-grade, full-stack web application designed for advanced text mining and data extraction in the materials science and battery research domains. It leverages domain-specific models (MatSciBERT, MaterialBERT, BatteryBERT) along with OpenAI's GPT-4 for natural language summarization and Q&A. The application uses FastAPI for the backend API and Gradio for an interactive demo interface, and it is deployed on Hugging Face Spaces.
## Features
- **Domain-Specific Text Mining:**
  Extract entities and relations (e.g., material properties, synthesis methods) from scientific literature using specialized BERT models (see the sketch after this list).
- **Interactive Q&A:**
Ask questions related to materials science and battery research and receive context-aware answers.
- **Summarization:**
  Generate plain-language summaries of research text with GPT-4.
- **User-Friendly Interface:**
A Gradio-based demo allows for quick testing and visualization of extracted information.
- **Scalable & Secure:**
Built using FastAPI with containerization (Docker) and CI/CD practices, ensuring robust production deployment on Hugging Face Spaces.
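
The minimal sketch below shows how the extraction and summarization features fit together conceptually. It assumes a MatSciBERT checkpoint fine-tuned for token classification (the model ID is a placeholder; the base `m3rg-iitd/matscibert` model would first need fine-tuning) and an `OPENAI_API_KEY` in the environment. The app's actual logic lives in `app/models.py` and `app/openai_integration.py`.

```python
# Minimal sketch: domain NER + GPT-4 summarization (not the app's exact code).
from transformers import pipeline
from openai import OpenAI

ner = pipeline(
    "token-classification",
    model="path/to/matscibert-ner",   # placeholder: a NER-fine-tuned MatSciBERT checkpoint
    aggregation_strategy="simple",
)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

text = "LiFePO4 cathodes were synthesized via a sol-gel route at 700 C."

# Entity extraction with the domain model
entities = ner(text)

# Plain-language summary with GPT-4
summary = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": f"Summarize for a general audience: {text}"}],
)

print(entities)
print(summary.choices[0].message.content)
```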
## Repository Structure
```
/materials-ai-app
├── app
│   ├── __init__.py
│   ├── main.py                 # FastAPI backend server
│   ├── models.py               # Domain model loading and inference functions
│   ├── openai_integration.py   # OpenAI helper functions (e.g., summarization)
│   └── utils.py                # Utility functions (e.g., text/PDF parsing)
├── gradio_app.py               # Gradio demo interface for interactive Q&A and extraction
├── Dockerfile                  # Docker configuration for containerization
├── requirements.txt            # List of Python dependencies
└── README.md                   # Project documentation (this file)
```
## Installation
1. **Clone the Repository:**

   ```bash
   git clone https://huggingface.co/spaces/mgbam/materials-ai-app
   cd materials-ai-app
   ```
2. **Set Up a Virtual Environment:**

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows use: venv\Scripts\activate
   ```
3. **Install Dependencies:**

   ```bash
   pip install -r requirements.txt
   ```
4. **Configure Environment Variables:**
   Set the following environment variables (e.g., in a `.env` file or your shell):
   - `OPENAI_API_KEY`: Your OpenAI API key.
   - `API_URL`: URL of your FastAPI backend (if different from localhost).
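
For example, a local `.env` (or shell profile) might contain placeholder values like these:

```bash
# Placeholder values; substitute your own API key and backend URL.
export OPENAI_API_KEY="sk-..."
export API_URL="http://localhost:8000"
```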
## Running Locally

### Backend API

Start the FastAPI server:

```bash
uvicorn app.main:app --reload
```

Access the API at http://localhost:8000.
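
FastAPI also serves interactive API docs at http://localhost:8000/docs. As a rough client sketch (the route name below is hypothetical; the real routes are defined in `app/main.py`):

```python
# Hypothetical client call; adjust the route and payload to match app/main.py.
import requests

resp = requests.post(
    "http://localhost:8000/extract",  # hypothetical endpoint
    json={"text": "LiCoO2 cathodes degrade above 4.2 V."},
    timeout=30,
)
print(resp.json())
```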
### Gradio Demo

In another terminal, run:

```bash
python gradio_app.py
```

This launches the interactive Gradio interface for testing extraction and summarization.
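
A stripped-down sketch of what `gradio_app.py` might look like, assuming the backend exposes a (hypothetical) `/summarize` route and `API_URL` points at it:

```python
# Minimal Gradio demo sketch; the real gradio_app.py may differ.
import os

import gradio as gr
import requests

API_URL = os.getenv("API_URL", "http://localhost:8000")


def summarize(text: str) -> str:
    # Forward the text to the FastAPI backend (hypothetical /summarize route).
    resp = requests.post(f"{API_URL}/summarize", json={"text": text}, timeout=60)
    return resp.json().get("summary", "")


demo = gr.Interface(
    fn=summarize,
    inputs=gr.Textbox(lines=8, label="Research text"),
    outputs=gr.Textbox(label="Plain-language summary"),
    title="Materials AI App",
)

if __name__ == "__main__":
    demo.launch()
```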
## Deployment

This repository is configured for deployment on Hugging Face Spaces. Simply push your changes to the repository, and Hugging Face Spaces will build and deploy your app automatically.

Alternatively, you can use the provided Dockerfile:

```bash
docker build -t materials-ai-app .
docker run -p 8000:8000 materials-ai-app
```
## Contributing

Contributions, issues, and feature requests are welcome! Please check the issues page for more details.
## References

- MatSciBERT on Hugging Face
- Hugging Face Spaces Documentation
- OpenAI API Documentation
## License

This project is licensed under the MIT License.

## Contact

For any questions or feedback, please reach out to [[email protected]].