TTM-Research-R2

Maintained by: IBM

Property         Value
Parameter Count  855k parameters
License          CC-BY-NC-SA-4.0
Paper            arXiv:2401.03955
Tensor Type      F32

What is TTM-Research-R2?

TTM-Research-R2 (Tiny Time Mixers) is a compact pre-trained model for multivariate time-series forecasting developed by IBM Research. As one of the first "tiny" pre-trained models in the field, it achieves strong forecasting accuracy with a footprint of just 855k parameters. The model targets minutely to hourly data resolutions and offers both zero-shot and few-shot forecasting capabilities.

Implementation Details

The model follows a focused pre-training approach: each TTM variant is optimized for a specific forecasting setting defined by its context and forecast lengths. Supported configurations span context lengths from 512 to 1536 and forecast horizons of up to 720 time points. The architecture emphasizes efficiency and practical deployment, allowing execution even on CPU-only machines.

  • Supports both channel independence and channel-mixing approaches for multivariate forecasting
  • Enables exogenous variable infusion and categorical data integration
  • Requires standard scaling of input data per channel (see the sketch after this list)
  • Optimized for minutely and hourly resolution data
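
As a minimal sketch of zero-shot use under these constraints (assuming IBM's granite-tsfm package, its tsfm_public TinyTimeMixerForPrediction class, and the Hugging Face repo id ibm-research/ttm-research-r2; verify all identifiers against the model card), per-channel standard scaling plus inference might look like this:

```python
# Hedged sketch: zero-shot forecasting with TTM on standard-scaled data.
# Assumes IBM's granite-tsfm package (`pip install granite-tsfm`) and the
# repo id "ibm-research/ttm-research-r2"; verify both before relying on this.
import numpy as np
import torch
from sklearn.preprocessing import StandardScaler
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

CONTEXT_LEN = 512   # minimum supported context length
N_CHANNELS = 3      # example multivariate series with 3 channels

# Toy input of shape (time, channels); replace with real history.
history = np.random.randn(CONTEXT_LEN, N_CHANNELS).astype("float32")

# TTM expects each channel to be standard-scaled (zero mean, unit variance).
scaler = StandardScaler()
history_scaled = scaler.fit_transform(history)

# Load a pre-trained checkpoint; no fine-tuning needed for zero-shot use.
model = TinyTimeMixerForPrediction.from_pretrained("ibm-research/ttm-research-r2")
model.eval()

# Forward pass: past_values has shape (batch, context_length, num_channels).
past_values = torch.tensor(history_scaled).float().unsqueeze(0)
with torch.no_grad():
    output = model(past_values=past_values)

# prediction_outputs: (batch, forecast_length, num_channels), in scaled units.
forecast_scaled = output.prediction_outputs.squeeze(0).numpy()
forecast = scaler.inverse_transform(forecast_scaled)  # back to original units
print(forecast.shape)  # e.g. (96, 3) for a 512-96 variant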

Core Capabilities

  • Zero-shot forecasting without additional training
  • Few-shot fine-tuning with minimal target data (as little as 5%; see the sketch after this list)
  • Multiple context length support (512, 1024, 1536)
  • Forecast horizons from 96 to 720 time points
  • Efficient execution on standard hardware
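
To ground the few-shot capability, here is a hedged sketch of fine-tuning on a small slice of (already standard-scaled) target data using the standard Hugging Face Trainer; the sliding-window dataset, hyperparameters, and repo id are illustrative assumptions, not the reference recipe:

```python
# Hedged sketch: few-shot fine-tuning of TTM with the Hugging Face Trainer.
# Dataset construction and hyperparameters below are assumptions to verify.
import numpy as np
import torch
from torch.utils.data import Dataset
from transformers import Trainer, TrainingArguments
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

CONTEXT_LEN, FORECAST_LEN, N_CHANNELS = 512, 96, 3

class WindowDataset(Dataset):
    """Sliding windows of (past_values, future_values) from one long series."""
    def __init__(self, series: np.ndarray):
        self.series = series.astype("float32")

    def __len__(self):
        return len(self.series) - CONTEXT_LEN - FORECAST_LEN + 1

    def __getitem__(self, i):
        past = self.series[i : i + CONTEXT_LEN]
        future = self.series[i + CONTEXT_LEN : i + CONTEXT_LEN + FORECAST_LEN]
        return {
            "past_values": torch.tensor(past),
            "future_values": torch.tensor(future),
        }

# Pretend this is the small (~5%) standard-scaled slice of target data.
train_data = WindowDataset(np.random.randn(2000, N_CHANNELS))

model = TinyTimeMixerForPrediction.from_pretrained("ibm-research/ttm-research-r2")

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="ttm_finetuned",
        num_train_epochs=5,
        per_device_train_batch_size=64,
        learning_rate=1e-3,   # assumption; tune for your data
        report_to="none",
    ),
    train_dataset=train_data,
)
trainer.train()  # the model computes its own loss when future_values is given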

Frequently Asked Questions

Q: What makes this model unique?

TTM's distinguishing feature is its extremely compact size: at just 855k parameters, it outperforms forecasting models with billions of parameters. It is among the first to introduce "tiny" pre-trained models for time series forecasting, making it highly accessible and easy to deploy.

Q: What are the recommended use cases?

The model is ideal for minutely to hourly time series forecasting, particularly in scenarios that require quick deployment or have limited computational resources. It is best suited to multivariate time series with standard-scaled data and a context length of at least 512 time points.
