Collection of Bamba models: a hybrid Mamba2-based architecture trained on open data
ibm-ai-platform
company
Foundation Model Stack
Foundation Model Stack (fms) is a collection of components, developed out of IBM Research, for the development, inference, training, and tuning of foundation models using PyTorch-native components.
Optimizations
In FMS, we aim to bring the latest pre-training, inference, and fine-tuning optimizations to all of our models. These include, but are not limited to:
- fully compilable models with no graph breaks
- full tensor-parallel support for all applicable modules developed in fms
- training scripts leveraging FSDP
- state-of-the-art lightweight speculators for improving inference performance
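As a concrete illustration of the first point: a model with no graph breaks can be compiled with `torch.compile(fullgraph=True)`, which raises an error on any graph break instead of silently falling back to eager execution. The sketch below is illustrative, assuming only that `torch` is installed; `TinyBlock` is a stand-in, not an actual fms module.

```python
import torch

class TinyBlock(torch.nn.Module):
    """Stand-in for an fms module: a single linear layer with GELU."""
    def __init__(self, dim: int = 16):
        super().__init__()
        self.proj = torch.nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.nn.functional.gelu(self.proj(x))

model = TinyBlock()
# fullgraph=True asks TorchDynamo to capture the whole forward pass as one
# graph; any graph break raises an error here rather than degrading
# performance. backend="eager" keeps execution in eager mode while still
# tracing; in practice the default inductor backend would be used instead.
compiled = torch.compile(model, fullgraph=True, backend="eager")
out = compiled(torch.randn(2, 16))
print(out.shape)  # torch.Size([2, 16])
```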
Usage
FMS is currently deployed in the Text Generation Inference Server.
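The lightweight speculators mentioned above follow the speculative-decoding pattern: a cheap draft model proposes several tokens at once and the expensive target model verifies them, keeping the longest agreeing prefix. A toy sketch in plain Python, with deterministic stand-in "models" in place of real networks (the function names and logic here are illustrative, not the fms API):

```python
from typing import Callable, List

def speculative_step(
    prefix: List[int],
    draft: Callable[[List[int]], int],
    target: Callable[[List[int]], int],
    k: int = 4,
) -> List[int]:
    """Propose k draft tokens, then verify them against the target model."""
    # Draft phase: the cheap model proposes k tokens autoregressively.
    proposed: List[int] = []
    ctx = list(prefix)
    for _ in range(k):
        tok = draft(ctx)
        proposed.append(tok)
        ctx.append(tok)

    # Verify phase: accept draft tokens while the target agrees; on the
    # first disagreement, take the target's token and stop. (In a real
    # system the target checks all k positions in one batched forward pass.)
    accepted: List[int] = []
    ctx = list(prefix)
    for tok in proposed:
        expected = target(ctx)
        if tok != expected:
            accepted.append(expected)
            break
        accepted.append(tok)
        ctx.append(tok)
    return prefix + accepted

# Deterministic stand-ins: the draft guesses "last token + 1"; the target
# agrees, except it caps tokens at 3.
draft = lambda ctx: ctx[-1] + 1
target = lambda ctx: min(ctx[-1] + 1, 3)

print(speculative_step([0], draft, target))  # [0, 1, 2, 3, 3]
```

When the draft model agrees with the target, several tokens are accepted per expensive verification step, which is where the inference speedup comes from.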
Repositories
- foundation-model-stack: Main repository on which all fms models are based
- fms-extras: New features staged for integration into foundation-model-stack
- fms-fsdp: Pre-training examples using FSDP-wrapped foundation models
- fms-hf-tuning: Basic tuning scripts for fms models leveraging SFTTrainer
Collections: 2
Spaces: 1
Models: 14
- ibm-fms/Bamba-9B-2T-fp8 (Text Generation)
- ibm-fms/Bamba-9B-1.8T-fp8 (Text Generation)
- ibm-fms/Bamba-9B-fp8 (Text Generation)
- ibm-fms/Bamba-9B-2T (Text Generation)
- ibm-fms/Bamba-9B-1.8T (Text Generation)
- ibm-fms/Bamba-9B (Text Generation)
- ibm-fms/llama3-70b-accelerator
- ibm-fms/llama2-70b-accelerator
- ibm-fms/llama-160m-accelerator
- ibm-fms/codellama-34b-accelerator
Datasets: none public yet