Moirai-MoE-1.0-R-Base

Salesforce/moirai-moe-1.0-R-base

~0.94B stored params | 512 context | $0.50 input | $1.50 output

Moirai-MoE-1.0-R-Base is Salesforce's sparse mixture-of-experts extension of the Moirai line. Instead of relying on a single dense model for every behavior, it routes each token through a small subset of specialized experts, improving parameter efficiency and specialization across heterogeneous series. Salesforce's official Moirai-MoE materials position the Base variant as a top zero-shot performer in the family while keeping inference cheaper than an equally sized dense model.
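
To make the routing idea concrete, here is a sketch of a generic top-k sparse MoE feed-forward layer in PyTorch. This illustrates the technique, not Salesforce's actual implementation; the dimensions, expert count, and top-k value are arbitrary assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative top-k mixture-of-experts feed-forward layer."""

    def __init__(self, d_model=256, d_ff=512, num_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )
        self.gate = nn.Linear(d_model, num_experts)  # router: scores each expert per token
        self.top_k = top_k

    def forward(self, x):  # x: (batch, seq, d_model)
        scores = self.gate(x)                           # (B, S, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the selected experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

layer = SparseMoELayer()
print(layer(torch.randn(2, 16, 256)).shape)  # torch.Size([2, 16, 256])
```

Only the selected experts run for a given token, which is how a model with ~0.94B stored parameters can activate far fewer per forward pass than an equally sized dense model.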

Model Classification

Family

Moirai-MoE-1.0-R

Type

time series foundation model

Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.
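
For local experimentation outside the hosted endpoint, the checkpoint can also be loaded through Salesforce's uni2ts library. A minimal zero-shot sketch, assuming the MoiraiMoEForecast / MoiraiMoEModule classes and argument names from recent uni2ts releases (verify against your installed version; the toy series and horizon below are made up):

```python
import numpy as np
from gluonts.dataset.common import ListDataset
from uni2ts.model.moirai_moe import MoiraiMoEForecast, MoiraiMoEModule

model = MoiraiMoEForecast(
    module=MoiraiMoEModule.from_pretrained("Salesforce/moirai-moe-1.0-R-base"),
    prediction_length=24,
    context_length=512,   # matches the listed context length
    patch_size=16,
    num_samples=100,      # sample paths drawn from the probabilistic head
    target_dim=1,
    feat_dynamic_real_dim=0,
    past_feat_dynamic_real_dim=0,
)
predictor = model.create_predictor(batch_size=32)

# Toy univariate history; real workloads pass their own series.
series = ListDataset(
    [{"start": "2024-01-01", "target": np.random.randn(512).cumsum()}],
    freq="H",
)
forecast = next(iter(predictor.predict(series)))
print(forecast.quantile(0.5)[:5])  # median forecast for the first 5 steps
```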

Training Data

The official checkpoint card is sparse; Salesforce's Moirai-MoE materials describe large-scale pretraining on heterogeneous time series for the Moirai-MoE setup rather than a separate, narrower corpus for this exact checkpoint.

Recommended For

  • Multivariate forecasting across heterogeneous domains
  • Workloads that benefit from probabilistic outputs and arbitrary variate counts (see the interval sketch after this list)
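
Probabilistic outputs arrive as sampled future paths, so prediction intervals fall out of simple quantiles over the sample axis. A self-contained sketch with synthetic stand-in samples (the array shape mirrors what a sampling-based forecaster returns; the values and quantile levels are arbitrary):

```python
import numpy as np

# Stand-in for sampled forecast paths: shape (num_samples, prediction_length).
samples = np.random.randn(100, 24).cumsum(axis=1)

lo, med, hi = np.quantile(samples, [0.1, 0.5, 0.9], axis=0)
for t in range(5):
    print(f"t+{t + 1}: median={med[t]:.2f}, 80% interval=[{lo[t]:.2f}, {hi[t]:.2f}]")
```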

Strengths

  • Strong multivariate coverage across the Moirai family
  • Well-suited to covariates and correlated series

Limitations

  • Model cards for some newer Moirai variants are still sparse on exact checkpoint details
  • Larger variants in the family are more expensive to run than small single-purpose baselines

Capabilities

forecasting · quantile-forecasting · multivariate · zero-shot · high-throughput

Tags

salesforce · moirai · moe · sparse

Specifications

Parameters
~0.94B stored params
Architecture
sparse MoE decoder-only transformer with probabilistic output heads
Context length
512
Max output
1,024
Avg latency
n/a
Uptime
n/a
Rate limit
n/a
Accelerator
NVIDIA GPU
Regions
Virginia, US
License
n/a

Pricing

Input / 1M tokens
$0.50
Output / 1M tokens
$1.50
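
At these rates, request cost is a linear function of token counts. A quick sketch (the token counts are illustrative; a full 512-token context with the 1,024-token maximum output serves as the example):

```python
INPUT_RATE = 0.50   # USD per 1M input tokens
OUTPUT_RATE = 1.50  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the listed rates."""
    return input_tokens / 1e6 * INPUT_RATE + output_tokens / 1e6 * OUTPUT_RATE

print(f"${request_cost(512, 1024):.6f}")  # $0.001792
```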

Performance

Average latency
n/a
Availability
n/a
Rate limit
n/a