# Timer-base-84m

| Property | Value |
|---|---|
| Parameter Count | 84M |
| Architecture | Causal Transformer (decoder-only) |
| Context Length | 2880 time points |
| Pre-training Scale | 260B time points |
| License | Apache-2.0 |
| Author | thuml |
Author | thuml |
## What is timer-base-84m?

Timer-base-84m is a lightweight generative Transformer model designed for zero-shot time series forecasting. Developed by thuml, it was pre-trained on 260 billion time points, which lets it produce forecasts on unseen series without task-specific training.
## Implementation Details

The model is built on a causal (decoder-only) Transformer with 8 layers and supports a context length of up to 2880 time points. Inputs are tokenized into patches of 96 time points each. The published implementation targets Transformers 4.40.1 and Python 3.10. A loading sketch follows the feature list below.
- 8-layer Transformer architecture
- 84M trainable parameters
- Patch length of 96
- Maximum context length of 2880
- Pre-trained on 260B time points
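
As a quick orientation, the sketch below loads the checkpoint through the standard Transformers auto class (the repository ships custom modeling code, so `trust_remote_code=True` is needed) and shows the patch arithmetic implied by the numbers above. The loading pattern follows thuml's published example; treat it as a sketch rather than a pinned API.

```python
from transformers import AutoModelForCausalLM

# Load the pre-trained 84M-parameter checkpoint; Timer ships its own
# modeling code, hence trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(
    "thuml/timer-base-84m", trust_remote_code=True
)

# With a patch length of 96, a full 2880-point context is split into
# 2880 / 96 = 30 patch tokens before entering the 8 Transformer layers.
context_length, patch_length = 2880, 96
print(context_length // patch_length)  # 30
```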
## Core Capabilities
- Zero-shot point forecasting (see the sketch below)
- Efficient time series prediction
- Handles varying sequence lengths
- Compatible with TSLib Dataset benchmarks
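
A minimal zero-shot point-forecasting sketch, assuming (as in the author's usage example) that the model's custom `generate` accepts a float tensor of shape `(batch, lookback_length)` and returns the next `max_new_tokens` values:

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "thuml/timer-base-84m", trust_remote_code=True
)

# Toy history: one series of 2880 points (random here; substitute real data).
lookback = torch.randn(1, 2880)

# Zero-shot point forecast of the next 96 time points, no fine-tuning.
forecast = model.generate(lookback, max_new_tokens=96)
print(forecast.shape)  # expected: torch.Size([1, 96])
```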
## Frequently Asked Questions

**Q: What makes this model unique?**

Timer-base-84m stands out for an efficient architecture that enables zero-shot forecasting without task-specific fine-tuning, while keeping a comparatively small footprint of 84M parameters next to billion-parameter foundation models.
**Q: What are the recommended use cases?**

The model is well suited to point forecasting on time series with histories of up to 2880 points: given historical data, it generates predictions for future time points, as sketched below.
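
Because the context window tops out at 2880 points, longer histories should be truncated to the most recent window first. The helper below, `prepare_context`, is a hypothetical convenience (not part of the model's API), and the standardize-then-invert step is common zero-shot practice rather than something the model mandates:

```python
import torch

def prepare_context(series: torch.Tensor, max_context: int = 2880) -> torch.Tensor:
    """Keep the most recent `max_context` points of a 1-D series and add
    a batch dimension so it fits Timer's context window.
    (Hypothetical helper, not part of the model's API.)"""
    return series[-max_context:].unsqueeze(0)

context = prepare_context(torch.randn(10_000))  # -> shape (1, 2880)
mean, std = context.mean(), context.std()
normed = (context - mean) / std
# Forecast on the normalized series, then invert the scaling:
# preds = model.generate(normed, max_new_tokens=96) * std + mean
```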