April 9, 2026

Models Catalog

New Models: Timer-S1, MOMENT Small & Base, Moirai-MoE Small, PatchTST-FM

Five new time-series foundation models added to the hosted catalog

#New Models

We have added five new models to the TSFM.ai hosted catalog:

#Timer-S1

ByteDance's Timer-S1 is a billion-scale sparse mixture-of-experts (MoE) TSFM with 8.3B total parameters (0.75B active per token), an 11.5K context window, and native 9-quantile probabilistic output. It achieves state-of-the-art MASE and CRPS on the GIFT-Eval leaderboard. Apache-2.0 licensed.

  • bytedance-research/Timer-S1
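
Since Timer-S1 is served through the hosted catalog, here is a rough sketch of requesting quantile forecasts over HTTP. The endpoint URL, payload fields, and response shape below are hypothetical placeholders for illustration, not the documented TSFM.ai API:

```python
import requests

# Hypothetical endpoint and payload shape -- illustrative only,
# not the documented TSFM.ai API.
API_URL = "https://api.tsfm.ai/v1/forecast"  # hypothetical

payload = {
    "model": "bytedance-research/Timer-S1",
    "series": [112.0, 118.0, 132.0, 129.0, 121.0, 135.0],  # recent history
    "horizon": 24,                 # steps to forecast
    "quantiles": [0.1, 0.5, 0.9],  # subset of the model's 9 native quantiles
}

resp = requests.post(API_URL, json=payload, timeout=60)
resp.raise_for_status()
forecast = resp.json()

# Assumed response shape: one array per requested quantile,
# each of length `horizon`.
median = forecast["quantiles"]["0.5"]
lo, hi = forecast["quantiles"]["0.1"], forecast["quantiles"]["0.9"]
print(f"step 1: {median[0]:.1f} (80% interval {lo[0]:.1f}-{hi[0]:.1f})")
```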

#MOMENT-1-Small & MOMENT-1-Base

AutonLab's MOMENT-1-Small (~40M parameters) and MOMENT-1-Base (~125M parameters) join the existing MOMENT-1-Large, giving you a full size ladder for the MOMENT family. All three share the same multi-task architecture, supporting forecasting, classification, anomaly detection, and imputation. MIT licensed.

  • AutonLab/MOMENT-1-small
  • AutonLab/MOMENT-1-base
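
For local use, both checkpoints should load through the upstream momentfm package the same way its docs load MOMENT-1-Large. A minimal zero-shot forecasting sketch; the 96-step horizon is arbitrary, and we assume the new checkpoints accept the same task heads as the large model:

```python
import torch
from momentfm import MOMENTPipeline  # pip install momentfm

# Load the new small checkpoint following the documented pattern
# for MOMENT-1-large; same-head support for the new sizes is our
# assumption, since these checkpoints are new.
model = MOMENTPipeline.from_pretrained(
    "AutonLab/MOMENT-1-small",
    model_kwargs={"task_name": "forecasting", "forecast_horizon": 96},
)
model.init()

# MOMENT expects inputs of shape (batch, channels, 512).
x = torch.randn(1, 1, 512)
with torch.no_grad():
    output = model(x_enc=x)
print(output.forecast.shape)  # expected: (1, 1, 96)
```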

#Moirai-MoE-1.0-R-Small

Salesforce's Moirai-MoE-1.0-R-Small fills out the sparse-expert Moirai-MoE line below the existing Base variant. It offers the same MoE routing architecture with fewer parameters for faster inference. Note: unlike the other models in this release, it is licensed CC-BY-NC-4.0 (non-commercial).

  • Salesforce/moirai-moe-1.0-R-small
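
A minimal zero-shot sketch following the uni2ts README pattern for the existing Moirai-MoE checkpoints; that the new small variant loads the same way is our assumption, and the horizon/context values are placeholders:

```python
from uni2ts.model.moirai_moe import MoiraiMoEForecast, MoiraiMoEModule

model = MoiraiMoEForecast(
    module=MoiraiMoEModule.from_pretrained("Salesforce/moirai-moe-1.0-R-small"),
    prediction_length=24,     # forecast horizon
    context_length=200,       # history the model conditions on
    patch_size=16,            # Moirai-MoE uses a fixed patch size of 16
    num_samples=100,          # sample paths for the probabilistic forecast
    target_dim=1,             # univariate target
    feat_dynamic_real_dim=0,  # no known-future covariates
    past_feat_dynamic_real_dim=0,
)

# create_predictor returns a GluonTS predictor; feed it GluonTS-style
# test instances to obtain sample-based forecasts.
predictor = model.create_predictor(batch_size=32)
```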

#PatchTST-FM R1

IBM's PatchTST-FM R1 is the foundation-model evolution of PatchTST, trained for broad zero-shot forecasting with an 8192-token context window and 99-quantile probabilistic output. IBM positions it as a top-5 replicable zero-shot model on the GIFT-Eval leaderboard. Apache-2.0 licensed.

  • ibm-granite/granite-timeseries-patchtst-fm-r1
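
A tentative local-loading sketch. Hugging Face transformers does ship PatchTST classes (e.g. PatchTSTForPrediction), but whether the R1 checkpoint loads through them, and how its 99-quantile head is exposed, is an assumption on our part; check the model card for the supported path:

```python
import torch
from transformers import PatchTSTForPrediction

# ASSUMPTION: the R1 checkpoint loads via the stock transformers
# PatchTST classes; IBM may ship a dedicated loader instead.
model = PatchTSTForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-patchtst-fm-r1"
)

ctx = model.config.context_length  # up to 8192 tokens per this release
past = torch.randn(1, ctx, 1)      # (batch, context_length, num_channels)
with torch.no_grad():
    out = model(past_values=past)
print(out.prediction_outputs.shape)  # (batch, horizon, num_channels)
```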