
Moirai-1.0-R-Large

Salesforce/moirai-1.0-R-large

311M params | 512 context | $0.50 input | $1.50 output

Moirai-1.0-R-Large is the highest-capacity dense model in the original Moirai family. It preserves the same masked-encoder, any-variate forecasting design as the smaller checkpoints, scaled up for stronger broad zero-shot performance. Pick it when you want the first-generation architecture at its highest published dense capacity.
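A minimal loading sketch, assuming Salesforce's open-source uni2ts library (the reference implementation for Moirai). The constructor arguments mirror uni2ts's published examples, but treat the exact signatures as assumptions and verify them against the library's documentation before relying on them:

```python
MAX_CONTEXT = 512  # published context length for this checkpoint

def effective_context(history_length: int) -> int:
    # Clamp the history window fed to the model to the 512-step context.
    return min(history_length, MAX_CONTEXT)

def build_forecaster(prediction_length: int, context_length: int = MAX_CONTEXT):
    # Heavy imports live inside the function so the sketch can be read
    # (and the helper above used) without torch/uni2ts installed.
    from uni2ts.model.moirai import MoiraiForecast, MoiraiModule  # assumed API

    model = MoiraiForecast(
        module=MoiraiModule.from_pretrained("Salesforce/moirai-1.0-R-large"),
        prediction_length=prediction_length,
        context_length=context_length,
        patch_size="auto",  # let the model pick among its multi-patch projections
        num_samples=100,    # sampled paths from the mixture-distribution head
        target_dim=1,
        feat_dynamic_real_dim=0,
        past_feat_dynamic_real_dim=0,
    )
    return model.create_predictor(batch_size=32)
```

The returned predictor follows GluonTS conventions in uni2ts's examples, so it can be fed a GluonTS-style dataset for batched inference.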

Model Classification

Family

Moirai-1.0-R

Type

time series foundation model

Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.

Training Data

LOTSA, the Large-scale Open Time Series Archive, with roughly 27B observations across nine domains including energy, transport, finance, healthcare, sales, climate, web, and social data.

Recommended For

  • Multivariate forecasting across heterogeneous domains
  • Workloads that benefit from probabilistic outputs and arbitrary variate counts
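Because the model's head emits a mixture distribution, probabilistic outputs are typically consumed as sampled forecast paths reduced to quantile bands. A self-contained sketch of that reduction, with hypothetical helper names (your pipeline may already provide an equivalent):

```python
def empirical_quantile(samples, q):
    # Nearest-rank quantile over sampled values for one horizon step;
    # an illustrative stand-in for whatever quantile reduction you use.
    ordered = sorted(samples)
    idx = round(q * (len(ordered) - 1))
    return ordered[idx]

def quantile_band(paths, q_lo=0.1, q_hi=0.9):
    # paths: list of sampled trajectories (one list per sample path,
    # all the same length). Returns (lower, upper) bands per step.
    steps = list(zip(*paths))
    lower = [empirical_quantile(s, q_lo) for s in steps]
    upper = [empirical_quantile(s, q_hi) for s in steps]
    return lower, upper
```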

Strengths

  • Strong multivariate coverage relative to the rest of the Moirai family
  • Well-suited to covariates and correlated series

Limitations

  • Model cards for some newer Moirai variants are still sparse on exact checkpoint details
  • Heavier family choices can be more expensive than tiny single-purpose baselines

Capabilities

forecasting · quantile-forecasting · multivariate · covariates · zero-shot

Tags

salesforce · moirai · multivariate · quality-tier

Specifications

Parameters
311M
Architecture
masked encoder transformer with multi-patch projections, any-variate attention, and mixture-distribution output
Context length
512
Max output
1,024
Avg latency
n/a
Uptime
n/a
Rate limit
n/a
Accelerator
NVIDIA GPU
Regions
Virginia, US
License
n/a
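The architecture row above compresses three ideas: series are cut into patches via multi-patch projections, all variates are flattened into one token sequence for any-variate attention, and the head emits a mixture distribution. A schematic of the first two steps, using hypothetical helper names rather than the model's actual internals:

```python
def to_patches(series, patch_size):
    # Cut one series into non-overlapping patches; the ragged tail is
    # dropped here for simplicity (the real model pads internally).
    usable = len(series) - len(series) % patch_size
    return [series[i:i + patch_size] for i in range(0, usable, patch_size)]

def flatten_variates(patched_variates):
    # Any-variate attention sees one flat token sequence; each patch token
    # carries (variate_id, time_index) so attention can tell variates apart.
    return [
        (vid, t, patch)
        for vid, patches in enumerate(patched_variates)
        for t, patch in enumerate(patches)
    ]
```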

Pricing

Input / 1M tokens
$0.50
Output / 1M tokens
$1.50
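The two pricing rows combine into a simple per-request cost estimate; a sketch assuming the listed per-million-token rates apply linearly, with no minimums or rounding:

```python
INPUT_USD_PER_M = 0.50   # input rate from the pricing table above
OUTPUT_USD_PER_M = 1.50  # output rate from the pricing table above

def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    # Linear per-token billing; real invoices may differ.
    return (input_tokens * INPUT_USD_PER_M
            + output_tokens * OUTPUT_USD_PER_M) / 1_000_000
```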

Performance

Average latency
n/a
Availability
n/a
Rate limit
n/a