
Kairos-23M

mldi-lab/Kairos_23m

23M params | 512 context | $0.50 input | $1.50 output (per 1M tokens)

Kairos-23M is the mid-size public Kairos checkpoint. It keeps the same adaptive tokenization and instance-specific positional encoding strategy as the rest of the family, while offering more capacity than the 10M model for broader zero-shot coverage. It is a good middle ground when you want the Kairos design without jumping all the way to the 50M checkpoint.
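Because the checkpoint accepts at most 512 context points, inputs longer than that need to be trimmed before a forecast request. A minimal sketch (the helper name is hypothetical; only the 512-point context limit comes from the spec sheet below):

```python
def prepare_context(series, context_length=512):
    """Trim a 1-D series to its most recent `context_length` points,
    matching Kairos-23M's 512-point context window."""
    if len(series) > context_length:
        series = series[-context_length:]
    return series

history = list(range(1000))   # toy series with 1,000 points
ctx = prepare_context(history)
print(len(ctx))               # 512
```

Keeping the most recent points (rather than the oldest) is the usual choice for forecasting, since the model conditions on the end of the window.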

Model Classification

Family

Kairos

Type

time series foundation model

Pretrained time-series model exposed on TSFM.ai for zero-shot or few-shot forecasting workloads.

Training Data

PreSTS corpus with 300B+ time points, as documented by the official Kairos model cards and project page.

Recommended For

  • Adaptive zero-shot forecasting across heterogeneous series
  • Teams that want a mid-size open family with modern tokenization ideas

Strengths

  • Adaptive tokenization handles changing information density well
  • Clear parameter-size ladder from small to larger public checkpoints

Limitations

  • Newer family with a smaller production footprint than the most established lines
  • Focused on forecasting rather than general multi-task time-series tooling

Capabilities

forecasting, zero-shot, adaptive-tokenization

Tags

kairos, adaptive, zero-shot

Specifications

Parameters
23M
Architecture
encoder-decoder transformer with adaptive patching and instance-adaptive rotary position encoding
Context length
512
Max output
1,024
Avg latency
n/a
Uptime
n/a
Rate limit
n/a
Accelerator
NVIDIA GPU
Regions
Virginia, US
License
n/a

Pricing

Input / 1M tokens
$0.50
Output / 1M tokens
$1.50
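A quick way to estimate per-request cost from these rates (the one-point-per-token mapping in the example call is an assumption, not something the spec sheet states):

```python
INPUT_PRICE = 0.50    # USD per 1M input tokens
OUTPUT_PRICE = 1.50   # USD per 1M output tokens

def estimate_cost(input_tokens, output_tokens):
    """Estimated USD cost for one request at the listed rates."""
    return (input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE) / 1_000_000

# A maximal request at the spec limits (512 context in, 1,024 forecast out),
# assuming one time point maps to one token:
print(f"${estimate_cost(512, 1024):.6f}")  # $0.001792
```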

Performance

Average latency
n/a
Availability
n/a
Rate limit
n/a