Moirai-MoE-1.0-R-Small

Salesforce/moirai-moe-1.0-R-small

~0.47B stored params | 512 context | $0.50 input | $1.50 output | CC-BY-NC-4.0

Moirai-MoE-1.0-R-Small is the lightweight variant of Salesforce's sparse-expert Moirai-MoE family. It shares the same MoE routing architecture as the Base variant but with fewer parameters, making it the most cost-efficient entry point to the Moirai-MoE design. Note: this model is released under a non-commercial license (CC-BY-NC-4.0).

Model Classification

Family

Moirai-MoE-1.0-R

Type

time series foundation model

Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.
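
For orientation, below is a minimal zero-shot sketch using Salesforce's uni2ts library, which ships the Moirai-MoE reference code. The MoiraiMoEForecast / MoiraiMoEModule names follow the uni2ts README; the CSV file, horizon, and batch size are illustrative assumptions, not details from this card.

```python
# Minimal zero-shot forecasting sketch with Salesforce's uni2ts library
# (pip install uni2ts). The input CSV, horizon, and batch size are
# illustrative assumptions, not part of this model card.
import pandas as pd
from gluonts.dataset.pandas import PandasDataset
from uni2ts.model.moirai_moe import MoiraiMoEForecast, MoiraiMoEModule

# Load a univariate series into a GluonTS dataset (hypothetical file).
df = pd.read_csv("series.csv", index_col=0, parse_dates=True)
ds = PandasDataset(dict(df))

model = MoiraiMoEForecast(
    module=MoiraiMoEModule.from_pretrained("Salesforce/moirai-moe-1.0-R-small"),
    prediction_length=64,            # forecast horizon (assumed)
    context_length=512,              # matches the card's context limit
    patch_size=16,
    num_samples=100,                 # draws from the predictive distribution
    target_dim=1,
    feat_dynamic_real_dim=0,
    past_feat_dynamic_real_dim=0,
)

predictor = model.create_predictor(batch_size=32)
forecasts = list(predictor.predict(ds))
print(forecasts[0].mean)             # point forecast from the sample paths
```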

Training Data

The official checkpoint card is sparse. Salesforce's Moirai-MoE materials describe pretraining on a large, heterogeneous collection of time series shared across the family, rather than a separate, narrow corpus for this exact checkpoint.

Recommended For

  • Multivariate forecasting across heterogeneous domains
  • Workloads that benefit from probabilistic outputs and arbitrary variate counts (see the quantile sketch below)
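
Because the family emits sampled predictive distributions rather than point estimates, quantile bands fall out directly. The sketch below reuses the hypothetical `forecasts` list from the uni2ts example above; `quantile()` is the standard GluonTS SampleForecast method.

```python
# Quantile bands from one sampled forecast, reusing `forecasts` from the
# sketch above. SampleForecast.quantile() is standard GluonTS.
f = forecasts[0]  # sample paths with shape (num_samples, prediction_length)
lo, med, hi = (f.quantile(q) for q in (0.1, 0.5, 0.9))

# An 80% prediction interval for each step of the horizon.
for t, (a, m, b) in enumerate(zip(lo, med, hi), start=1):
    print(f"t+{t}: {a:.2f} <= {m:.2f} <= {b:.2f}")
```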

Strengths

  • Strong multivariate coverage across the Moirai family
  • Well-suited to covariates and correlated series

Limitations

  • Model cards for some newer Moirai variants are still sparse on exact checkpoint details
  • Larger variants in the family can be more expensive to run than tiny single-purpose baselines

Capabilities

forecasting · quantile-forecasting · multivariate · zero-shot · high-throughput

Tags

salesforce · moirai · moe · sparse · efficient

Specifications

Parameters
~0.47B stored params
Architecture
sparse MoE decoder-only transformer with probabilistic output heads
Context length
512
Max output
1,024
Avg latency
n/a
Uptime
n/a
Plan limits
1,000 rpm free · 1,000,000 rpm with billing
Accelerator
NVIDIA GPU
Regions
Virginia, US
License
CC-BY-NC-4.0

Pricing

Input / 1M tokens
$0.50
Output / 1M tokens
$1.50
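
As a rough worked example at the listed rates (how TSFM.ai meters time-series values or patches as billable tokens is an assumption here; the arithmetic is just rate times volume):

```python
# Back-of-envelope request cost at the listed rates. How TSFM.ai meters
# time-series values/patches as "tokens" is an assumption here.
INPUT_RATE = 0.50 / 1_000_000   # $ per input token
OUTPUT_RATE = 1.50 / 1_000_000  # $ per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A full 512-token context plus a 64-token forecast:
print(f"${request_cost(512, 64):.6f}")  # $0.000352
```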
