Update README.md
README.md CHANGED
@@ -57,17 +57,17 @@ getting started [notebook](https://github.com/IBM/tsfm/blob/main/notebooks/hfdem
 
-- **512-96-r2**: Given the last 512 time-points (i.e. context length), this model can forecast up to the next 96 time-points (i.e. forecast length)
+- **512-96-ft-r2**: Given the last 512 time-points (i.e. context length), this model can forecast up to the next 96 time-points (i.e. forecast length)
 in future. (branch name: main)
 
-- **1024-96-r2**: Given the last 1024 time-points (i.e. context length), this model can forecast up to the next 96 time-points (i.e. forecast length)
-in future. (branch name: 1024-96-r2) [[Benchmarks]]
+- **1024-96-ft-r2**: Given the last 1024 time-points (i.e. context length), this model can forecast up to the next 96 time-points (i.e. forecast length)
+in future. (branch name: 1024-96-ft-r2) [[Benchmarks]]
 
-- **1536-96-r2**: Given the last 1536 time-points (i.e. context length), this model can forecast up to the next 96 time-points (i.e. forecast length)
-in future. (branch name: 1536-96-r2)
+- **1536-96-ft-r2**: Given the last 1536 time-points (i.e. context length), this model can forecast up to the next 96 time-points (i.e. forecast length)
+in future. (branch name: 1536-96-ft-r2)
 
-- Likewise, we have models released for forecast lengths up to 720 time-points. The branch names for these are as follows: `512-192-r2`, `1024-192-r2`, `1536-192-r2`,
-`512-336-r2`, `1024-336-r2`, `1536-336-r2`, `512-720-r2`, `1024-720-r2`, `1536-720-r2`
+- Likewise, we have models released for forecast lengths up to 720 time-points. The branch names for these are as follows: `512-192-ft-r2`, `1024-192-ft-r2`, `1536-192-ft-r2`,
+`512-336-ft-r2`, `1024-336-ft-r2`, `1536-336-ft-r2`, `512-720-ft-r2`, `1024-720-ft-r2`, `1536-720-ft-r2`
 
 Please use the [[get_model]](https://github.com/ibm-granite/granite-tsfm/blob/main/tsfm_public/toolkit/get_model.py) utility to automatically select the required model based on your input context length and forecast length requirement.
 
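A minimal sketch of how the [[get_model]] utility referenced in this hunk is typically called, assuming `get_model` from `tsfm_public.toolkit.get_model` (the module linked above) accepts the model path plus `context_length` and `prediction_length`; treat the keyword names as assumptions:

```python
from tsfm_public.toolkit.get_model import get_model

# Resolve the TTM variant whose context/forecast lengths match the request;
# the helper maps these onto the corresponding branch (e.g. 1024-96-r2).
model = get_model(
    "ibm-granite/granite-timeseries-ttm-r2",
    context_length=1024,   # last 1024 observed time-points
    prediction_length=96,  # forecast the next 96 time-points
)
```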
@@ -79,6 +79,7 @@ but can provide any forecast lengths up to 720 in get_model() to get the require
 
 The below model scripts can be used for any of the above TTM models. Please update the HF model URL and branch name in the `from_pretrained` call appropriately to pick the model of your choice.
 
+Since these models use frequency prefix tuning, ensure that your dataset yaml (as mentioned in the notebooks below) has frequency information, and set `enable_prefix_tuning` to True in `load_dataset`.
 - Getting Started [[colab]](https://colab.research.google.com/github/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_getting_started.ipynb)
 - Zeroshot Multivariate Forecasting [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_getting_started.ipynb)
 - Finetuned Multivariate Forecasting:
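For the `from_pretrained` note in this hunk, branch selection on the Hugging Face Hub is done through the `revision` argument; a hedged sketch, assuming the `TinyTimeMixerForPrediction` class exported by the granite-tsfm toolkit:

```python
from tsfm_public import TinyTimeMixerForPrediction

# The branch name (revision) encodes context length, forecast length, and release,
# e.g. "1536-96-ft-r2" = 1536-point context, 96-point forecast, frequency prefix tuned.
model = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r2",
    revision="1536-96-ft-r2",
)
```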
@@ -107,7 +108,7 @@ Please note that the Granite TTM models are pre-trained exclusively on datasets
 with clear commercial-use licenses that are approved by our legal team. As a result, the pre-training dataset used in this release differs slightly from the one used in the research
 paper, which may lead to minor variations in model performance as compared to the published results. Please refer to our paper for more details.
 
-**Benchmarking Scripts: [here](https://github.com/ibm-granite/granite-tsfm/
+**Benchmarking Scripts: [here](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/tinytimemixer/full_benchmarking/research-use-r2.sh)**
 
 ## Recommended Use
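On the frequency prefix tuning note in the second hunk: the only details the diff confirms are that `load_dataset` takes an `enable_prefix_tuning` flag and that the dataset yaml must carry frequency information. A purely hypothetical sketch; the import path and every parameter other than `enable_prefix_tuning` are illustrative assumptions, not a verified API:

```python
# Hypothetical: the module path and all parameters except enable_prefix_tuning
# are placeholders for illustration only.
from tsfm_public.toolkit.data_handling import load_dataset

dset_train, dset_val, dset_test = load_dataset(
    dataset_name="etth1",       # its yaml entry must include frequency information
    context_length=512,
    forecast_length=96,
    enable_prefix_tuning=True,  # required for the frequency-prefix-tuned ft-r2 models
)
```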