Lag-Llama

Maintained by: time-series-foundation-models

Property         Value
Parameter Count  2.45M
License          Apache 2.0
Paper            arXiv:2310.08278
Tensor Type      F32

What is Lag-Llama?

Lag-Llama is the first open-source foundation model designed specifically for time series forecasting. It produces probabilistic forecasts and handles a wide range of time series frequencies and prediction lengths.

Implementation Details

The model has 2.45M parameters stored as F32 tensors. It supports both zero-shot forecasting and fine-tuning, and uses RoPE (rotary position embedding) scaling to handle context lengths larger than those seen during training.

  • Supports variable context lengths (32 to 1024)
  • Probabilistic output distribution for each predicted timestep
  • Flexible frequency handling for any time series data
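The RoPE-scaling idea mentioned above can be illustrated in a few lines: positions are divided by a scale factor so that an inference context longer than the training window maps back into the angle range the model was trained on (linear position interpolation). This is a minimal stdlib sketch, not Lag-Llama's actual implementation; the function name and dimensions are made up for illustration.

```python
import math

def rope_angles(position, dim=8, base=10000.0, scale=1.0):
    """Rotary-embedding angles for one position.

    scale > 1 compresses positions, so a context longer than the
    training context reuses the trained angle range (linear
    position-interpolation RoPE scaling).
    """
    pos = position / scale
    return [pos / (base ** (2 * i / dim)) for i in range(dim // 2)]

# Position 1024 with scale=2 lands on the same angles as
# position 512 at training time.
train_angles = rope_angles(512, scale=1.0)
scaled_angles = rope_angles(1024, scale=2.0)
assert all(math.isclose(a, b) for a, b in zip(train_angles, scaled_angles))
```

With a scale of 2, a 1024-step context occupies the positional range of a 512-step one, which is why the model can be queried with contexts beyond its training length.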

Core Capabilities

  • Zero-shot forecasting on any frequency dataset
  • Fine-tuning capabilities with customizable learning rates
  • Context length optimization for improved performance
  • Early stopping support with validation split options
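The early-stopping behavior in the list above can be sketched as a small helper that watches per-epoch validation losses and stops once the loss has not improved for a set number of epochs. This is a generic illustration with hypothetical names, not code from the Lag-Llama training loop.

```python
def train_with_early_stopping(val_losses, patience=3):
    """Scan per-epoch validation losses and stop once the loss has
    not improved for `patience` consecutive epochs.

    Returns (best_epoch, best_loss)."""
    best, best_epoch, waited = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, waited = loss, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break  # validation loss plateaued: stop training
    return best_epoch, best

# Loss improves for three epochs, then plateaus: best epoch is 2.
assert train_with_early_stopping([1.0, 0.8, 0.7, 0.75, 0.74, 0.9]) == (2, 0.7)
```

In practice the validation losses would come from the held-out split of your dataset, and "stopping" would mean restoring the checkpoint saved at the best epoch.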

Frequently Asked Questions

Q: What makes this model unique?

Lag-Llama is the first open-source foundation model specifically designed for time series forecasting, offering both zero-shot capabilities and fine-tuning options with probabilistic outputs.

Q: What are the recommended use cases?

The model is ideal for time series forecasting tasks across various frequencies. It's recommended to first benchmark zero-shot performance on your data before considering fine-tuning, which can further improve results with sufficient training data.
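One standard way to benchmark a probabilistic, sample-based forecast like Lag-Llama's against your own data is quantile (pinball) loss on the forecast's sample paths. The sketch below uses only the standard library and made-up numbers; it is not Lag-Llama's evaluation code, just an illustration of scoring one predicted timestep.

```python
def pinball_loss(samples, actual, q):
    """Pinball (quantile) loss of the empirical q-quantile of the
    forecast `samples` against the observed value; lower is better."""
    s = sorted(samples)
    # Nearest-rank empirical quantile of the forecast samples.
    pred = s[min(int(q * len(s)), len(s) - 1)]
    diff = actual - pred
    return q * diff if diff >= 0 else (q - 1) * diff

# 100 forecast samples for one timestep, observed value 0.55:
# the median forecast is 0.50, so the q=0.5 loss is small.
samples = [i / 100 for i in range(100)]
loss = pinball_loss(samples, actual=0.55, q=0.5)
```

Averaging this loss over several quantiles and all forecast timesteps gives a single zero-shot score you can compare against a fine-tuned run before deciding whether fine-tuning is worth the training cost.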
