wmgifford committed
Commit 11e404e · verified · 1 Parent(s): 7b69a48

Update README.md

Files changed (1)
  1. README.md +9 -9
README.md CHANGED
@@ -41,16 +41,16 @@ R1 and R2 variants and pick the best for your data.
 
 - **512-96-r2**: Given the last 512 time-points (i.e. context length), this model can forecast up to next 96 time-points (i.e. forecast length)
 in future. This model is pre-trained with a larger pretraining dataset for improved accuracy. Recommended for hourly and minutely
- resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 512-96-r2) [[Benchmarks]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/tinytimemixer/ttm-r2_benchmarking_512_96.ipynb)
+ resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 512-96-r2) [[Benchmarks]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/tinytimemixer/ttm-r2_benchmarking_512_96.ipynb)
 
 - **1024-96-r2**: Given the last 1024 time-points (i.e. context length), this model can forecast up to next 96 time-points (i.e. forecast length)
 in future. This model is pre-trained with a larger pretraining dataset for improved accuracy. Recommended for hourly and minutely
- resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 1024-96-r2) [[Benchmarks]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/tinytimemixer/ttm-r2_benchmarking_1024_96.ipynb)
+ resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 1024-96-r2) [[Benchmarks]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/tinytimemixer/ttm-r2_benchmarking_1024_96.ipynb)
 
 
 - **1536-96-r2**: Given the last 1536 time-points (i.e. context length), this model can forecast up to next 96 time-points (i.e. forecast length)
 in future. This model is pre-trained with a larger pretraining dataset for improved accuracy. Recommended for hourly and minutely
- resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 1536-96-r2) [[Benchmarks]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/tinytimemixer/ttm-r2_benchmarking_1536_96.ipynb)
+ resolutions (Ex. 10 min, 15 min, 1 hour, etc). (branch name: 1536-96-r2) [[Benchmarks]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/tinytimemixer/ttm-r2_benchmarking_1536_96.ipynb)
 
 
 
@@ -59,15 +59,15 @@ R1 and R2 variants and pick the best for your data.
 The below model scripts can be used for any of the above TTM models. Please update the HF model URL and branch name in the `from_pretrained` call appropriately to pick the model of your choice.
 
 - Getting Started [[colab]](https://colab.research.google.com/github/IBM/tsfm/blob/main/notebooks/tutorial/ttm_tutorial.ipynb)
- - Zeroshot Multivariate Forecasting [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/ttm_getting_started.ipynb)
+ - Zeroshot Multivariate Forecasting [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_getting_started.ipynb)
 - Finetuned Multivariate Forecasting:
- - Channel-Independent Finetuning [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/ttm_getting_started.ipynb) [M4-Hourly finetuning](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/tinytimemixer/ttm_m4_hourly.ipynb)
+ - Channel-Independent Finetuning [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_getting_started.ipynb) [M4-Hourly finetuning](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/tinytimemixer/ttm_m4_hourly.ipynb)
- - Channel-Mix Finetuning [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/tutorial/ttm_channel_mix_finetuning.ipynb)
+ - Channel-Mix Finetuning [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/tutorial/ttm_channel_mix_finetuning.ipynb)
 - **New Releases (extended features released on October 2024)**
- - Finetuning and Forecasting with Exogenous/Control Variables [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/tutorial/ttm_with_exog_tutorial.ipynb)
+ - Finetuning and Forecasting with Exogenous/Control Variables [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/tutorial/ttm_with_exog_tutorial.ipynb)
 - Finetuning and Forecasting with static categorical features [Example: To be added soon]
- - Rolling Forecasts - Extend forecast lengths beyond 96 via rolling capability [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/hfdemo/ttm_rolling_prediction_getting_started.ipynb)
+ - Rolling Forecasts - Extend forecast lengths beyond 96 via rolling capability [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/hfdemo/ttm_rolling_prediction_getting_started.ipynb)
- - Helper scripts for optimal Learning Rate suggestions for Finetuning [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/ttm_v2_release/notebooks/tutorial/ttm_with_exog_tutorial.ipynb)
+ - Helper scripts for optimal Learning Rate suggestions for Finetuning [[Example]](https://github.com/ibm-granite/granite-tsfm/blob/main/notebooks/tutorial/ttm_with_exog_tutorial.ipynb)
 
 ## Benchmarks
 
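The unchanged context line above notes that the HF model URL and branch name passed to `from_pretrained` select the TTM variant. Below is a minimal sketch of what such a call might look like, assuming the Hugging Face repo id `ibm-granite/granite-timeseries-ttm-r2` and the `TinyTimeMixerForPrediction` class from the `granite-tsfm` (`tsfm_public`) package; the repo id, output field name, and dummy data are illustrative assumptions, not part of this diff.

```python
# Sketch: pick a TTM variant by passing its branch name as the `revision` argument.
# Assumptions: repo id "ibm-granite/granite-timeseries-ttm-r2" and the
# TinyTimeMixerForPrediction class from the granite-tsfm (tsfm_public) package.
import torch
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

# "1024-96-r2" = context length 1024, forecast length 96 (see the variant list above).
model = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r2",
    revision="1024-96-r2",
)

# Zero-shot forecast on dummy data: past_values is (batch, context_length, num_channels).
past_values = torch.randn(1, 1024, 3)
with torch.no_grad():
    out = model(past_values=past_values)

# prediction_outputs holds the forecast; expected shape here: (1, 96, 3).
print(out.prediction_outputs.shape)
```

The same pattern would apply to the 512 and 1536 variants; only the `revision` string and the context-length dimension of `past_values` change.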