---
title: README
emoji: πŸš€
colorFrom: indigo
colorTo: blue
sdk: static
pinned: false
---

# IBM AI Platform

IBM's AI Platform is a collection of components from IBM Research used for the development, inference, training, and tuning of foundation models, built on native PyTorch components.

## Optimizations

In this platform, we aim to bring the latest optimizations for pre-training, inference, and fine-tuning to all of our models. These optimizations include, but are not limited to, the following (a short example is sketched after the list):

- fully compilable models with no graph breaks
- full tensor-parallel support for all applicable modules developed in fms
- training scripts leveraging FSDP
- state-of-the-art lightweight speculators for improving inference performance
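
As a brief illustration of the first point, the sketch below loads a model through `fms` and compiles its forward pass with `torch.compile`. This is a minimal, hedged sketch: it assumes the `get_model` loading API from foundation-model-stack, and the architecture name, variant, and weight path are placeholders.

```python
# Hedged sketch: load an fms model and compile the full forward pass.
# The get_model() arguments ("llama", "7b", model_path, source) are
# illustrative and assume foundation-model-stack's checkpoint conventions.
import torch
from fms.models import get_model

model = get_model("llama", "7b", model_path="/path/to/weights", source="hf")
model.eval()

# Because fms models are written without graph breaks, torch.compile can
# capture the entire forward pass in a single graph.
compiled_model = torch.compile(model)

with torch.no_grad():
    input_ids = torch.randint(0, 32000, (1, 16))  # dummy token ids
    logits = compiled_model(input_ids)
```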

## Usage

Components such as speculative decoding have been deployed to [vLLM](https://docs.vllm.ai/en/latest/getting_started/examples/mlpspeculator.html).
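
The sketch below follows the linked MLPSpeculator example: vLLM serves the base model while a lightweight speculator drafts tokens ahead of it. The model identifiers and the `speculative_model` / `num_speculative_tokens` parameters mirror that example and may differ across vLLM releases, so treat this as an assumption-laden sketch rather than a fixed API.

```python
# Hedged sketch of speculative decoding in vLLM with an MLP speculator.
# Check your vLLM version for the current speculative-decoding API.
from vllm import LLM, SamplingParams

llm = LLM(
    model="meta-llama/Llama-2-13b-chat-hf",          # base model (illustrative)
    speculative_model="ibm-fms/llama-13b-accelerator",  # lightweight speculator
    num_speculative_tokens=5,
)

outputs = llm.generate(
    ["Explain speculative decoding in one sentence."],
    SamplingParams(temperature=0.0, max_tokens=64),
)
print(outputs[0].outputs[0].text)
```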

## Repositories

- [foundation-model-stack](https://github.com/foundation-model-stack/foundation-model-stack): Main repository on which all AI platform models are based
- [fms-extras](https://github.com/foundation-model-stack/fms-extras): New features staged to be integrated into our AI platform
- [fms-fsdp](https://github.com/foundation-model-stack/fms-fsdp): Pre-training examples using FSDP-wrapped foundation models
- [fms-hf-tuning](https://github.com/foundation-model-stack/fms-hf-tuning): Basic tuning scripts for AI platform models leveraging SFTTrainer