Blog
Insights on time series foundation models, forecasting techniques, and the TSFM.ai platform.
Real-Time Streaming Inference with TSFMs: Moving from Batch to Continuous Forecasting
Most TSFM deployments run forecasts in hourly or daily batches, but a growing class of use cases demands continuous, low-latency predictions. Here's how to architect streaming inference pipelines for time series foundation models.
2026
4 articles
The 2026 TSFM Toolkit: Which Foundation Model for Which Job?
With 18+ time series foundation models now available, choosing the right one for your workload is the real challenge. Here's a practitioner's decision framework.
Multimodal Time Series: Combining Numerical Data with Text and Natural Language Context
What if forecast models could read a description of your data alongside the numbers? Multimodal time series research is exploring how natural language context — domain descriptions, known events, business constraints — can improve forecasts, especially in zero-shot settings.
The TSFM Landscape in 2026: Trends and Predictions
As we enter 2026, the time series foundation model ecosystem is maturing rapidly. Here are the trends shaping the next year.
Network Traffic Forecasting and Telecom Capacity Planning with TSFMs
Telecom and network operators generate some of the richest time series data on earth — CDN throughput, cell tower load, DNS query rates — and time series foundation models can forecast it without per-metric training.
2025
20 articles
Scaling TSFM Inference: GPU Optimization
Serving TSFMs at scale requires careful GPU optimization. Here's how we achieve sub-100ms latency for batch forecasting.
Toto: Datadog's Domain-Specific TSFM for Observability
Datadog's Toto is a 151M-parameter transformer trained on trillions of observability data points, purpose-built for forecasting infrastructure metrics like CPU utilization, error rates, and request latency.
TiRex: Nixtla's Covariate-Native Large-Scale Forecasting Transformer
Nixtla's TiRex brings native covariate support to large-scale time series forecasting with a dedicated encoder for exogenous regressors, a 16K context window, and strong zero-shot performance on covariate-rich datasets.
Smart Model Routing: Choosing the Best TSFM
Not every time series needs the same model. TSFM.ai's routing engine automatically selects the best foundation model for each request.
Predictive Maintenance in Manufacturing with TSFMs
Time series foundation models enable predictive maintenance across thousands of machine types without training individual models, catching bearing degradation, tool wear, and compressor decay before costly failures occur.
LLM Reprogramming for Time Series: Time-LLM's Prompt-as-Prefix Approach
Time-LLM reprograms a frozen LLaMA-7B to forecast time series by mapping patches into the LLM's embedding space and prepending natural language task descriptions, raising a provocative question about what large language models actually learn.
Supply Chain and Logistics Forecasting with TSFMs
Time series foundation models tackle the toughest supply chain forecasting problems, from the bullwhip effect to cold-start routes, enabling leaner inventory and fewer stockouts across complex logistics networks.
How Retailers Use TSFMs for Demand Planning
From inventory optimization to markdown pricing, time series foundation models are transforming how retailers forecast demand.
Healthcare Applications of Time Series Foundation Models
From ICU vital sign monitoring to hospital capacity planning, time series foundation models address healthcare's most pressing data challenges — starting with the chronic scarcity of labeled clinical data.
MOMENT: CMU's Model for Time Series Understanding
MOMENT from Carnegie Mellon is a multi-task time series foundation model that handles forecasting, classification, anomaly detection, and imputation.
Climate and Weather Forecasting with Time Series Foundation Models
Time series foundation models are finding a practical niche in climate and weather applications — not replacing physics-based models, but filling gaps they leave behind.
Context Length in TSFMs: How Much History Do Foundation Models Need?
Time series foundation models accept anywhere from 4K to 16K input time steps, but more context is not always better. We break down when longer history helps, when it hurts, and how to choose the right lookback for your data.
The State of Multivariate Forecasting in 2025
Multivariate time series forecasting remains one of the hardest problems in ML. Here's where foundation models stand in 2025.
Diffusion Models for Time Series: Inside Tsinghua's Sundial
Sundial applies flow-matching diffusion to time series forecasting, producing full predictive distributions from noise in fewer steps than standard diffusion models.
Covariates in Time Series Forecasting: A Practical Guide to Exogenous Regressors
Covariates like holidays, promotions, and weather can dramatically improve forecast accuracy. Here's how modern TSFMs handle them and when they actually help.
Fine-Tuning vs. Zero-Shot: When to Customize
Zero-shot TSFMs are powerful out of the box, but sometimes fine-tuning on your data delivers a meaningful accuracy boost. Here's how to decide.
Tiny Models, Big Results: IBM's Granite TTM and the MLP-Mixer Architecture for Time Series
IBM's Granite TTM packs competitive forecasting accuracy into roughly 1 million parameters by replacing attention with MLP-Mixer layers, enabling sub-100ms inference on CPU and opening the door to edge and serverless deployment.
Chronos v2: What's New and Why It Matters
Amazon's Chronos v2 (Chronos-Bolt) brings major improvements: up to 250x faster inference, a patch-based T5 encoder-decoder architecture with direct multi-step quantile forecasts, and stronger benchmark results.
Conformal Prediction: Calibrated Uncertainty Intervals for Time Series Foundation Models
Model-native prediction intervals are often miscalibrated. Conformal prediction provides a distribution-free wrapper that turns any forecaster's output into intervals with guaranteed coverage.
Case Study: Energy Demand Forecasting
How a European energy utility used TSFM.ai to improve demand forecasts by 23% compared to their existing gradient boosted tree pipeline.
2024
15 articles
Introducing TSFM.ai: The Unified API for Time Series
We're launching TSFM.ai — a single API that gives you access to every major time series foundation model with automatic routing and optimization.
The Challenges of Benchmarking TSFMs
Benchmarking time series foundation models is harder than it looks. Here's why results often conflict and what the field is doing about it.
Time-MoE: Alibaba's Mixture-of-Experts Architecture for Time Series Forecasting
Alibaba's Time-MoE brings sparse mixture-of-experts to time series forecasting, activating only 200M of its 2.4B parameters per input to achieve large-model capacity at small-model inference cost.
Lag-Llama: The Open-Source Time Series Foundation Model
Lag-Llama brings the decoder-only LLM architecture to time series with lag-based tokenization and distributional outputs.
Prediction Intervals vs. Point Forecasts
Why a single predicted number is rarely enough, and how prediction intervals help you make better decisions under uncertainty.
Synthetic Training Data for TSFMs: KernelSynth and Gaussian Process Augmentation
How KernelSynth uses Gaussian process priors with composed kernels to generate synthetic time series, and why roughly a tenth of Chronos's training data is artificially generated.
Building Production Forecast Pipelines with TSFMs
A practical guide to moving time series foundation models from notebooks to production-grade forecasting systems.
Financial Time Series: Volatility Modeling and Risk Forecasting with TSFMs
Financial markets produce some of the most challenging time series data. Here's how time series foundation models handle volatility clustering, tail risk estimation, and regulatory risk forecasting.
Moirai: Salesforce's Universal Forecasting Transformer
Moirai from Salesforce introduces a universal forecasting transformer that handles variable frequencies, prediction lengths, and multivariate inputs.
Tokenization Strategies Compared: Quantization vs Patching vs Lag-Based
Time series foundation models must bridge the gap between continuous signals and discrete transformer inputs. We compare three dominant tokenization strategies — quantization, patching, and lag-based — and when each one works best.
Anomaly Detection with Time Series Foundation Models
Foundation models aren't just for forecasting — they're surprisingly effective at detecting anomalies in time series data.
TimesFM: Google's Approach to Time Series Foundation Models
Google's TimesFM is a decoder-only foundation model for time series, trained on roughly 100B real-world time points drawn largely from Google Trends and Wikipedia pageviews.
Zero-Shot Forecasting: Why It Matters
Zero-shot forecasting lets you generate predictions on unseen time series without any training. Here's why that's a game-changer.
A Deep Dive into Amazon Chronos
How Amazon's Chronos turns time series forecasting into a language modeling problem using tokenized values and T5 architectures.
What Are Time Series Foundation Models?
An introduction to time series foundation models — what they are, how they work, and why they represent a paradigm shift in forecasting.