Moirai-1.0-R-Base
Salesforce/moirai-1.0-R-base
91M params | 512 context | $0.50 / 1M input | $1.50 / 1M output
Moirai-1.0-R-Base is the reference dense checkpoint in the original Moirai family. It keeps the any-variate masked-encoder design while scaling capacity enough to improve general forecasting quality on heterogeneous multivariate settings. It is a good default dense Moirai checkpoint when you want stronger accuracy than the small model without stepping up to the much larger Moirai-1.0-R-Large variant.
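The "any-variate" design flattens every variate of a multivariate series into a single token sequence of patches, so one encoder handles an arbitrary number of series. A minimal numpy sketch of that flattening step (the fixed patch size here is illustrative; the real model chooses among several patch sizes via its multi-patch projections):

```python
import numpy as np

def to_patch_tokens(series: np.ndarray, patch_size: int) -> np.ndarray:
    """Flatten a (variates, timesteps) array into patch tokens.

    Sketch of Moirai's any-variate input handling: all variates are
    flattened into one token sequence, so the encoder never fixes the
    variate count. This is an illustration, not the library's API.
    """
    n_var, n_t = series.shape
    # Pad the time axis so it divides evenly into patches.
    pad = (-n_t) % patch_size
    padded = np.pad(series, ((0, 0), (0, pad)))
    # Each variate contributes ceil(n_t / patch_size) patch tokens.
    patches = padded.reshape(n_var, -1, patch_size)
    # Flatten variates and patches into one token sequence.
    return patches.reshape(-1, patch_size)

# 3 variates of 100 steps, patch size 32 -> 4 patches per variate.
tokens = to_patch_tokens(np.random.randn(3, 100), patch_size=32)
```

Each row of `tokens` is one patch token; the token count grows with both the number of variates and the series length, which is why the 512-token context bounds how much history-times-variates the model can attend over.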
Model Classification
Family
Moirai-1.0-R
Type
time series foundation model
Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.
Resources
Training Data
LOTSA, the Large-scale Open Time Series Archive, with roughly 27B observations across nine domains including energy, transport, finance, healthcare, sales, climate, web, and social data.
Recommended For
- Multivariate forecasting across heterogeneous domains
- Workloads that benefit from probabilistic outputs and arbitrary variate counts
Strengths
- Strong multivariate coverage across the Moirai family
- Well-suited to covariates and correlated series
Limitations
- Model cards for some newer Moirai variants are still sparse on exact checkpoint details
- Heavier family choices can be more expensive than tiny single-purpose baselines
Specifications
- Parameters
- 91M
- Architecture
- masked encoder transformer with multi-patch projections, any-variate attention, and mixture-distribution output
- Context length
- 512
- Max output
- 1,024
- Avg latency
- n/a
- Uptime
- n/a
- Rate limit
- n/a
- Accelerator
- NVIDIA GPU
- Regions
- Virginia, US
- License
- n/a
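The mixture-distribution output head means forecasts are distributions rather than point values: you draw samples and read off quantiles. A self-contained sketch of that final step, using hypothetical mixture parameters for a single forecast step (the component weights, means, and scales below are illustrative, not model outputs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a two-component Gaussian mixture at one
# forecast step; Moirai's head emits such mixture parameters per token.
weights = np.array([0.7, 0.3])
means = np.array([10.0, 14.0])
scales = np.array([1.0, 2.5])

# Monte Carlo sampling: pick a component per draw, then sample from it.
comp = rng.choice(len(weights), size=5000, p=weights)
samples = rng.normal(means[comp], scales[comp])

# Probabilistic forecasts are empirical quantiles of the sample set.
p10, p50, p90 = np.quantile(samples, [0.1, 0.5, 0.9])
```

The same sampling recipe extends to every step of the forecast horizon, giving calibrated prediction intervals instead of a single trajectory.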
Pricing
- Input / 1M tokens
- $0.50
- Output / 1M tokens
- $1.50
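A quick way to budget requests against the table above (rates taken from the pricing rows; the per-request token counts are examples):

```python
def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate cost in USD from the listed per-1M-token rates:
    $0.50 per 1M input tokens, $1.50 per 1M output tokens."""
    return input_tokens / 1e6 * 0.50 + output_tokens / 1e6 * 1.50

# e.g. a full 512-token context with the 1,024-token max output
cost = request_cost(512, 1024)
```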