PatchTST (ETTh1)
ibm-granite/granite-timeseries-patchtst
616K params | 512 context | $0.50 / 1M input tokens | $1.50 / 1M output tokens
This checkpoint is IBM's hosted PatchTST reference model for ETTh1, not a broad multi-domain TSFM family. It segments each input series into patches and feeds the resulting patch tokens through Transformer blocks, which lets it model long-horizon time-series behavior efficiently, and the published checkpoint is tuned for the ETTh1 workload. It is most useful as a strong baseline and an interpretable point of comparison against the larger foundation models in the catalog.
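The patch tokenization described above can be sketched in a few lines: the context window is sliced into (possibly overlapping) fixed-length patches, and each patch becomes one token for the Transformer. The `patch_len=16, stride=8` values below are illustrative defaults, not this checkpoint's actual configuration; only the 512-step context length comes from the card.

```python
import numpy as np

def patchify(series, patch_len=16, stride=8):
    """Split a 1-D series into overlapping patches (PatchTST-style tokenization).

    patch_len and stride are illustrative here, not the checkpoint's
    published hyperparameters.
    """
    n_patches = (len(series) - patch_len) // stride + 1
    return np.stack(
        [series[i * stride : i * stride + patch_len] for i in range(n_patches)]
    )

# 512-step context window, as listed for this checkpoint
context = np.arange(512, dtype=float)
patches = patchify(context)
print(patches.shape)  # (63, 16): 63 patch tokens of length 16
```

Because the Transformer attends over 63 patch tokens instead of 512 raw time steps, attention cost drops roughly quadratically, which is the efficiency argument behind the patch design.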
Model Classification
Family
PatchTST
Type
pretrained forecasting model
A pretrained forecasting checkpoint hosted on TSFM.ai as a reference model; upstream, it is not positioned as a general-purpose TSFM family.
Resources
Training Data
ETTh1 train split covering all seven ETTh1 channels, per the official model card.
Recommended For
- Reference long-horizon forecasting baselines
- Interpretable comparisons against stronger hosted TSFMs
Strengths
- Strong and widely cited long-horizon patch-transformer baseline
- Easy comparison point for benchmark readers
Limitations
- This hosted checkpoint is dataset-specific rather than a universal foundation model
- Narrower operational scope than the broader TSFM families in the catalog
Capabilities
forecasting, multivariate, long-horizon
Tags
ibm, patchtst, baseline, long-horizon
Specifications
- Parameters: 616K
- Architecture: PatchTST transformer with patch tokenization
- Context length: 512
- Max output: 1,024
- Avg latency: n/a
- Uptime: n/a
- Rate limit: n/a
- Accelerator: NVIDIA GPU
- Regions: Virginia, US
- License: n/a
Pricing
- Input / 1M tokens: $0.50
- Output / 1M tokens: $1.50
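The listed rates make per-request cost a simple linear formula. The helper below is a hypothetical illustration (the card does not define how time steps map to billed tokens; treating one step as one token is an assumption):

```python
def request_cost(input_tokens, output_tokens,
                 input_per_m=0.50, output_per_m=1.50):
    """Estimate request cost in USD from the listed per-million-token rates.

    Rates come from this card's Pricing section; the token accounting
    (one billed token per time step) is an assumption for illustration.
    """
    return input_tokens / 1e6 * input_per_m + output_tokens / 1e6 * output_per_m

# A full 512-token context with a 96-step forecast:
print(request_cost(512, 96))  # 0.0004 USD
```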
Performance
- Average latency: n/a
- Availability: n/a
- Rate limit: n/a