Kairos-23M
mldi-lab/Kairos_23m · 23M params | 2K context | $0.00025 per forecast
Kairos-23M is the mid-size public Kairos checkpoint from the Ant Group and ShanghaiTech University Kairos release. It keeps the same adaptive tokenization and instance-specific positional encoding strategy as the rest of the family, while offering more capacity than the 10M model for broader zero-shot coverage. It is a good middle ground when you want the Kairos design without jumping all the way to the 50M checkpoint.
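Since the checkpoint is served with a fixed 2,048-point context window and a 1,024-point forecast cap (see Specifications below), a client typically trims history and validates the horizon before sending a request. The sketch below shows that client-side preparation; the payload field names and the `mldi-lab/Kairos_23m` model identifier in the request body are illustrative assumptions, not the documented TSFM.ai API schema.

```python
# Client-side preparation of a zero-shot forecast request for Kairos-23M.
# Field names in the payload are hypothetical, not the official API schema.

MAX_CONTEXT = 2048   # Kairos-23M context length
MAX_HORIZON = 1024   # max output points per forecast

def build_forecast_request(history, horizon):
    """Truncate history to the context window and cap the horizon."""
    if horizon > MAX_HORIZON:
        raise ValueError(f"horizon {horizon} exceeds max output {MAX_HORIZON}")
    # Keep only the most recent MAX_CONTEXT observations; older
    # points would fall outside the model's window anyway.
    context = list(history)[-MAX_CONTEXT:]
    return {
        "model": "mldi-lab/Kairos_23m",  # hypothetical identifier
        "series": context,
        "horizon": horizon,
    }

req = build_forecast_request(range(5000), horizon=96)
print(len(req["series"]))  # 2048: oldest points are dropped
```

Truncating client-side keeps requests small and makes the effective window explicit, rather than relying on the server to silently discard excess history.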
Model Classification
Family
Kairos
Type
time series foundation model
Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.
Resources
Training Data
PreSTS corpus with 300B+ time points, as documented by the official Kairos model cards and project page.
Recommended For
- Adaptive zero-shot forecasting across heterogeneous series
- Teams that want a mid-size open family with modern tokenization ideas
Strengths
- Adaptive tokenization handles changing information density well
- Clear parameter-size ladder from small to larger public checkpoints
Limitations
- Newer family with a smaller production footprint than the most established lines
- Focused on forecasting rather than general multi-task time-series tooling
Capabilities
forecasting · zero-shot · adaptive-tokenization
Tags
kairos · adaptive · zero-shot
Specifications
- Parameters
- 23M
- Architecture
- encoder-decoder transformer with adaptive patching and instance-adaptive rotary position encoding
- Context length
- 2,048
- Max context
- 2,048
- Minimum history
- n/a
- Recommended history
- n/a
- Input step
- n/a
- Required target series
- 1
- Temperature
- Ignored
- Top P
- Ignored
- Max output
- 1,024
- Avg latency
- n/a
- Uptime
- n/a
- Plan limits
- 1,000 rpm free · 1,000,000 rpm with billing
- Accelerator
- T4
- Regions
- Virginia, US
- License
- n/a
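The plan limits above cap request throughput at 1,000 rpm on the free tier, so a client that batches many forecasts benefits from a local throttle. Below is a minimal sliding-window limiter sketch; the `RpmLimiter` class is an illustrative assumption, not part of any TSFM.ai SDK.

```python
import time
from collections import deque

class RpmLimiter:
    """Client-side sliding-window limiter for a requests-per-minute cap.

    Hypothetical helper for staying under the free-tier 1,000 rpm limit;
    not an official TSFM.ai component.
    """

    def __init__(self, rpm):
        self.rpm = rpm
        self.sent = deque()  # monotonic timestamps of recent requests

    def acquire(self, now=None):
        """Block (sleep) until a request may be sent, then record it."""
        now = time.monotonic() if now is None else now
        # Discard timestamps that have aged out of the 60 s window.
        while self.sent and now - self.sent[0] >= 60:
            self.sent.popleft()
        if len(self.sent) >= self.rpm:
            # Sleep until the oldest request leaves the window.
            wait = 60 - (now - self.sent[0])
            time.sleep(wait)
            now += wait
            self.sent.popleft()
        self.sent.append(now)

limiter = RpmLimiter(rpm=1000)  # free-tier cap
```

Call `limiter.acquire()` immediately before each forecast request; under the cap it returns instantly, and at the cap it sleeps just long enough to stay compliant.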
Pricing
- Per forecast
- $0.00025
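At $0.00025 per forecast, spend scales linearly with call volume, so budgeting is simple arithmetic. A quick sanity-check sketch (the helper function is illustrative, not a billing API):

```python
PRICE_PER_FORECAST = 0.00025  # USD, from the pricing table

def monthly_cost(forecasts_per_day, days=30):
    """Estimated monthly spend in USD at the flat per-forecast rate."""
    return forecasts_per_day * days * PRICE_PER_FORECAST

# e.g. 10,000 forecasts/day comes to roughly $75/month
print(monthly_cost(10_000))
```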