
PatchTST-FM R1

Status: online
ibm-granite/granite-timeseries-patchtst-fm-r1

~325M params | 8192 context | $0.50 input | $1.50 output | Apache-2.0

PatchTST-FM R1 is IBM's foundation-model extension of PatchTST, trained for broad zero-shot forecasting rather than a single dataset. It features an 8192-token context window, a 99-quantile probabilistic output head, and simultaneous imputation-and-forecasting capabilities for series with missing values. IBM positions it as a top-5 replicable zero-shot model on the GIFT-Eval leaderboard as of March 2026.
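PatchTST's core idea is to segment the input series into fixed-length patches, each of which becomes one transformer token, so the context window is measured in tokens (patches) rather than raw timesteps. A minimal sketch of that patching step; the patch length and stride below are illustrative assumptions, not this checkpoint's actual hyperparameters:

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int = 16, stride: int = 16) -> np.ndarray:
    """Split a 1-D series into fixed-length patches.

    Each patch becomes one transformer token, so context length is
    counted in patches, not raw timesteps.
    """
    n = series.shape[0]
    starts = range(0, n - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# A 512-step lookback with non-overlapping length-16 patches -> 32 tokens.
patches = patchify(np.arange(512, dtype=float))
print(patches.shape)  # (32, 16)
```

With a stride equal to the patch length the patches tile the lookback window exactly; a smaller stride would produce overlapping patches and more tokens for the same window.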

Model Classification

Family

PatchTST-FM / Granite TimeSeries

Type

time series foundation model

Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.

Training Data

Broad multi-domain pretraining corpus for zero-shot generalization; IBM positions the model on the GIFT-Eval leaderboard without disclosing the full mixture.

Recommended For

  • Zero-shot probabilistic forecasting across diverse domains
  • Long-context forecasting with simultaneous imputation of missing values
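Models that impute and forecast simultaneously typically consume a series with gaps as a (values, observed-mask) pair rather than raw NaNs. A hedged sketch of that preprocessing step; the actual input format expected by this checkpoint may differ:

```python
import numpy as np

def to_masked_input(series: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Replace NaNs with 0.0 and emit a binary observed-value mask.

    The mask tells the model which positions to treat as missing
    (and therefore impute) rather than as true zeros.
    """
    mask = (~np.isnan(series)).astype(np.float32)
    values = np.nan_to_num(series, nan=0.0).astype(np.float32)
    return values, mask

vals, mask = to_masked_input(np.array([1.0, np.nan, 3.0, np.nan]))
print(vals)  # [1. 0. 3. 0.]
print(mask)  # [1. 0. 1. 0.]
```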

Strengths

  • 8192-token context enables very long lookback windows
  • 99-quantile output provides rich uncertainty estimates without sampling
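With 99 output quantiles (q = 0.01 … 0.99), point forecasts and prediction intervals fall out by direct indexing, with no sampling required. A sketch assuming the head returns an array of shape (horizon, 99) ordered by quantile level (an assumption about the output layout):

```python
import numpy as np

def col(q: float) -> int:
    """Column index for quantile level q, given levels 0.01, 0.02, ..., 0.99."""
    return round(100 * q) - 1

def summarize(qf: np.ndarray) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
    """Median point forecast plus a central 80% band from a (horizon, 99) array."""
    return qf[:, col(0.50)], qf[:, col(0.10)], qf[:, col(0.90)]

# Toy example: each horizon step's quantile curve equals the level itself.
qf = np.tile(np.linspace(0.01, 0.99, 99), (4, 1))
median, lo, hi = summarize(qf)
```

Any other central interval (e.g. 50% via q = 0.25 and 0.75) is just a different pair of columns.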

Limitations

  • Large model footprint requires GPU inference
  • Newer foundation-model release with less production track record than the original PatchTST baseline

Capabilities

forecasting · quantile-forecasting · imputation · zero-shot · long-context

Tags

ibm · patchtst · foundation-model · probabilistic · long-context

Specifications

Parameters
~325M
Architecture
PatchTST transformer with 8192-token context and 99-quantile output head
Context length
8,192 tokens
Max output
1,024
Avg latency
n/a
Uptime
n/a
Plan limits
1,000 rpm free · 1,000,000 rpm with billing
Accelerator
NVIDIA GPU
Regions
Virginia, US
License
Apache-2.0

Pricing

Input / 1M tokens
$0.50
Output / 1M tokens
$1.50
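At these rates, per-request cost is simple arithmetic over token counts. A sketch using the listed prices; how the service meters time-series inputs as tokens is an assumption here, so treat this as an upper-bound estimate:

```python
INPUT_PER_M = 0.50   # USD per 1M input tokens
OUTPUT_PER_M = 1.50  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed per-1M-token rates."""
    return input_tokens * INPUT_PER_M / 1e6 + output_tokens * OUTPUT_PER_M / 1e6

# A full 8192-token context with the 1,024-token max output:
print(round(request_cost(8192, 1024), 6))  # 0.005632
```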
