Timer-base-84m

Maintained by: thuml

Property            Value
Parameter Count     84M
Architecture        Causal Transformer (decoder-only)
Context Length      2880 time points
Pre-training Scale  260B time points
License             Apache-2.0
Author              thuml

What is timer-base-84m?

Timer-base-84m is a lightweight generative Transformer model designed for zero-shot time series forecasting. Developed by thuml and pre-trained on 260 billion time points, it can forecast previously unseen series without task-specific fine-tuning.

Implementation Details

The model is built on a Causal Transformer architecture with 8 layers and supports a context length of up to 2880 time points. It uses a patch length of 96 and has been optimized for efficient time series prediction tasks. The implementation is compatible with the Transformers library version 4.40.1 and Python 3.10.

  • 8-layer Transformer architecture
  • 84M trainable parameters
  • Patch length of 96
  • Maximum context length of 2880
  • Pre-trained on 260B time points
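The patching arithmetic above can be sketched in a few lines: a sequence of up to 2880 points is segmented into non-overlapping patches of 96 values, yielding at most 30 input tokens for the causal Transformer. This is an illustrative sketch of the preprocessing idea, not the model's actual embedding code; the `patchify` helper is hypothetical.

```python
def patchify(series, patch_len=96):
    """Split a 1-D series into non-overlapping patches (tokens).

    Trailing points that do not fill a complete patch are dropped,
    mirroring common patch-embedding preprocessing.
    """
    n_patches = len(series) // patch_len
    return [series[i * patch_len:(i + 1) * patch_len] for i in range(n_patches)]

tokens = patchify(list(range(2880)))
print(len(tokens))     # 2880 // 96 = 30 patch tokens
print(len(tokens[0]))  # each token covers 96 time points
```

With the full context of 2880 points, the model therefore attends over a sequence of only 30 tokens, which is what keeps inference efficient despite the long raw history.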

Core Capabilities

  • Zero-shot point forecasting
  • Efficient time series prediction
  • Handles varying sequence lengths
  • Compatible with TSLib Dataset benchmarks

Frequently Asked Questions

Q: What makes this model unique?

Timer-base-84m stands out for its efficient architecture, which enables zero-shot forecasting without task-specific fine-tuning while keeping the parameter count to a modest 84M, small compared to typical large foundation models.

Q: What are the recommended use cases?

The model is particularly well-suited for time series forecasting tasks, especially when dealing with sequences up to 2880 time points. It's designed for point forecasting and can generate predictions for future time points based on historical data.
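As a decoder-only model, it produces point forecasts autoregressively: each newly predicted patch is appended to the context and used to predict the next one until the desired horizon is reached. The sketch below illustrates that loop with a stand-in `predict_fn`; the real model's inference API may differ, so treat this as a conceptual example only.

```python
def forecast(history, predict_fn, horizon, patch_len=96):
    """Autoregressive point forecasting: repeatedly predict the next
    patch of values and append it to the context, the way a causal
    decoder-only model rolls forward at inference time."""
    context = list(history)
    out = []
    while len(out) < horizon:
        next_patch = predict_fn(context)  # stand-in for the model's next-patch prediction
        context.extend(next_patch)
        out.extend(next_patch)
    return out[:horizon]

# Toy predictor for illustration: repeat the mean of the last patch.
toy = lambda ctx: [sum(ctx[-96:]) / 96] * 96
preds = forecast([float(i % 7) for i in range(2880)], toy, horizon=192)
print(len(preds))  # 192 forecast points (two patches of 96)
```

Because the horizon is covered patch by patch, forecasts longer than one patch (96 points) simply require more generation steps, at the cost of compounding any per-step error.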
