TimeMoE-50M

Status: online
Maple728/TimeMoE-50M

512 context · $0.50 / 1M input · $1.50 / 1M output

TimeMoE-50M is hosted on TSFM.ai in TSFM-US-01 for production forecasting workloads.
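Since the hosted API schema is not shown on this page, the sketch below only illustrates how a client might assemble a forecast request within the listed limits (512-point context, 1,024-point max output). All field names and the payload shape are assumptions, not the documented TSFM.ai API.

```python
# Hypothetical request builder for the TSFM.ai-hosted TimeMoE-50M endpoint.
# Field names ("model", "inputs", "horizon") are illustrative assumptions.
MAX_CONTEXT = 512   # context length from the spec table
MAX_OUTPUT = 1024   # max output from the spec table

def build_forecast_request(series, horizon):
    """Build a forecast payload, trimming the series to the context window."""
    if horizon > MAX_OUTPUT:
        raise ValueError(f"horizon {horizon} exceeds max output {MAX_OUTPUT}")
    return {
        "model": "Maple728/TimeMoE-50M",
        "inputs": list(series)[-MAX_CONTEXT:],  # keep the most recent 512 points
        "horizon": horizon,
    }
```

Trimming client-side keeps requests inside the 512-point window instead of relying on server-side truncation behavior.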

Model Classification

Family

TimeMoE

Type

n/a

Recommended For

  • Long-context forecasting that benefits from sparse-expert scaling
  • Teams studying mixture-of-experts (MoE) behavior in time-series foundation models
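For teams exploring MoE behavior, the core routing idea is top-k gating: each token is sent to only k of E experts, so capacity grows with E while per-token compute stays near-constant. This is a generic illustration in NumPy, not TimeMoE-50M's actual router.

```python
import numpy as np

def topk_gate(hidden, w_gate, k=2):
    """Sparse top-k gating: route each token to k of E experts.

    hidden: (tokens, d) activations; w_gate: (d, E) gating weights.
    Returns the chosen expert indices and their normalized mixing weights.
    """
    logits = hidden @ w_gate                          # (tokens, E) router scores
    topk_idx = np.argsort(logits, axis=-1)[:, -k:]    # indices of the k largest
    sel = np.take_along_axis(logits, topk_idx, axis=-1)
    w = np.exp(sel - sel.max(axis=-1, keepdims=True)) # softmax over selected only
    w /= w.sum(axis=-1, keepdims=True)
    return topk_idx, w
```

Because only k experts run per token, a model can hold far more parameters than it activates on any single forward pass, which is the "large-capacity behavior without a fully dense footprint" noted under Strengths.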

Strengths

  • Sparse experts deliver large-model capacity without the compute footprint of a fully dense network
  • Well suited to long-context autoregressive forecasting
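Long-context autoregressive forecasting means the model predicts one step at a time and feeds each prediction back into its context window. A minimal sketch of that loop, assuming a one-step-ahead model callable (`model_step` is a stand-in, not an actual API):

```python
def autoregressive_forecast(model_step, context, horizon, max_context=512):
    """Roll a one-step model forward, feeding predictions back as context.

    model_step: callable taking a window of recent values, returning the
    next value. max_context mirrors this model's 512-point context length.
    """
    history = list(context)
    preds = []
    for _ in range(horizon):
        window = history[-max_context:]  # slide the context window forward
        y = model_step(window)
        preds.append(y)
        history.append(y)                # prediction becomes future context
    return preds
```

With a 512-point window, long histories are consumed through this sliding window rather than all at once.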

Limitations

  • MoE serving and operational behavior is less familiar than that of dense baselines
  • Not the best first choice when a simple, compact deployment is the priority

Capabilities

forecasting · quantile-forecasting
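The quantile-forecasting capability is typically evaluated (and trained) with the pinball loss, which penalizes under- and over-prediction asymmetrically per quantile. A minimal sketch, not tied to this model's training recipe:

```python
def pinball_loss(y_true, y_pred, q):
    """Quantile (pinball) loss for a single quantile q in (0, 1).

    Under-prediction costs q per unit of error; over-prediction
    costs (1 - q), so e.g. q = 0.9 punishes under-prediction 9x harder.
    """
    diff = y_true - y_pred
    return max(q * diff, (q - 1) * diff)
```

Averaging this loss over several quantiles (e.g. 0.1, 0.5, 0.9) yields calibrated prediction intervals rather than a single point forecast.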

Tags

timemoe

Specifications

Parameters
n/a
Architecture
n/a
Context length
512
Max output
1,024
Avg latency
n/a
Uptime
n/a
Plan limits
1,000 rpm free · 1,000,000 rpm with billing
Accelerator
T4
Regions
n/a
License
n/a

Pricing

Input / 1M tokens
$0.50
Output / 1M tokens
$1.50
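At the listed rates ($0.50 per 1M input tokens, $1.50 per 1M output tokens), per-request cost is a straightforward linear function of token counts:

```python
INPUT_PER_M = 0.50   # USD per 1M input tokens (listed rate)
OUTPUT_PER_M = 1.50  # USD per 1M output tokens (listed rate)

def estimate_cost(input_tokens, output_tokens):
    """Estimate USD cost for one request at the listed per-token rates."""
    return (input_tokens / 1e6) * INPUT_PER_M + (output_tokens / 1e6) * OUTPUT_PER_M
```

For example, a request using the full 512-point context and 1,024-point output costs well under a cent, so throughput limits (not per-request price) are usually the binding constraint.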
