wmgifford committed on
Commit 9d3a4de
1 Parent(s): dc31cd1
Files changed (1):
  1. README.md +5 -3
README.md CHANGED
@@ -174,7 +174,7 @@ fewshot_output = finetune_forecast_trainer.evaluate(dset_test)
 
 ## Training Data
 
-The TTM models were trained on a collection of datasets from the Monash Time Series Forecasting repository. The datasets used include:
+The original r1 TTM models were trained on a collection of datasets from the Monash Time Series Forecasting repository. The datasets used include:
 - Australian Electricity Demand: https://zenodo.org/records/4659727
 - Australian Weather: https://zenodo.org/records/4654822
 - Bitcoin dataset: https://zenodo.org/records/5122101
@@ -187,10 +187,12 @@ The TTM models were trained on a collection of datasets from the Monash Time Series Forecasting repository. The datasets used include:
 - US Births: https://zenodo.org/records/4656049
 - Wind Farms Production data: https://zenodo.org/records/4654858
 - Wind Power: https://zenodo.org/records/4656032
-[to be updated]
+In addition to the above datasets, the updated TTM model (512-96-r2) was trained on the following:
+- PEMSD3, PEMSD4, PEMSD7, PEMSD8, PEMS_BAY: https://drive.google.com/drive/folders/1g5v2Gq1tkOq8XO0HDCZ9nOTtRpB6-gPe
+- LOS_LOOP: https://drive.google.com/drive/folders/1g5v2Gq1tkOq8XO0HDCZ9nOTtRpB6-gPe
 
 
-## Citation [optional]
+## Citation
 Kindly cite the following paper, if you intend to use our model or its associated architectures/approaches in your
 work
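The first hunk header above references `finetune_forecast_trainer.evaluate(dset_test)`, the few-shot evaluation step from the README's usage example. The real code uses the Hugging Face `Trainer` API on a TTM model; the following pure-Python stand-in is only a minimal sketch of the shape of such an evaluation (run a forecaster over held-out test windows and report mean squared error). All names here (`naive_forecast`, `evaluate`, `dset_test`) are illustrative assumptions, not the actual TTM implementation.

```python
def naive_forecast(history, horizon):
    # Persistence baseline: repeat the last observed value over the horizon.
    return [history[-1]] * horizon

def evaluate(forecast_fn, test_windows, horizon):
    # Each test window is (history, future); average the squared error
    # over every forecast step in every window, Trainer.evaluate-style.
    total, count = 0.0, 0
    for history, future in test_windows:
        preds = forecast_fn(history, horizon)
        total += sum((p - y) ** 2 for p, y in zip(preds, future))
        count += horizon
    return {"eval_mse": total / count}

# Toy held-out split: two windows with a 2-step forecast horizon.
dset_test = [([1.0, 2.0, 3.0], [3.0, 3.0]),
             ([4.0, 5.0, 6.0], [7.0, 8.0])]
metrics = evaluate(naive_forecast, dset_test, horizon=2)
```

Returning a metrics dictionary keyed by `eval_*` mirrors the convention of `Trainer.evaluate`, which is why the README's `fewshot_output` can be inspected for named metrics.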