# Moirai-1.0-R-small
| Property | Value |
|---|---|
| Parameter Count | 13.8M |
| License | CC-BY-NC-4.0 |
| Paper | arXiv:2402.02592 |
| Tensor Type | F32 |
## What is Moirai-1.0-R-small?

Moirai-1.0-R-small is a compact Masked Encoder-based Universal Time Series Forecasting Transformer. As the small variant of the Moirai family, it contains 13.8M parameters and was pre-trained on LOTSA, a large-scale open archive of time series data spanning many domains.
## Implementation Details

The model processes multivariate time series through patch embeddings, combining sequence and variate IDs within a transformer framework. It supports flexible prediction lengths and context windows, with patch sizes configurable from 8 to 128.
- Supports both target variables and dynamic covariates
- Utilizes mixture distribution parameters for forecasting
- Implements patchification with configurable patch sizes
- Offers batch processing capabilities
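To make the patchification step above concrete, here is a minimal, self-contained sketch of splitting a series into fixed-size patches before embedding. The function name and the left-padding strategy are illustrative assumptions, not the uni2ts API.

```python
def patchify(series, patch_size):
    """Split a 1-D time series into non-overlapping patches.

    Left-pads the front with zeros so the length divides evenly,
    mirroring the idea of patch-based tokenization (illustrative only;
    the actual Moirai implementation handles padding and masking internally).
    """
    values = list(series)
    pad = (-len(values)) % patch_size  # amount needed to reach a multiple
    padded = [0.0] * pad + values
    return [padded[i:i + patch_size] for i in range(0, len(padded), patch_size)]

# 20 observations with patch_size=8 -> padded to 24 -> 3 patches
patches = patchify(range(1, 21), patch_size=8)
```

Each patch then becomes one token for the transformer, which is why larger patch sizes (up to 128) trade temporal resolution for shorter input sequences.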
## Core Capabilities
- Universal time series forecasting across various domains
- Handles multivariate time series data
- Supports dynamic covariates in forecast horizon
- Provides probabilistic forecasts through mixture distributions
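The probabilistic output works by predicting the parameters of a mixture distribution and drawing samples from it, from which point forecasts and prediction intervals are derived. The toy sketch below shows the sampling idea with a hypothetical two-component Gaussian mixture; Moirai's actual mixture heads use different component families and learned parameters.

```python
import random
import statistics

def sample_mixture(weights, means, stdevs, n_samples, rng):
    """Draw forecast samples from a Gaussian mixture: pick a component
    in proportion to its weight, then sample from that component."""
    components = list(zip(means, stdevs))
    samples = []
    for _ in range(n_samples):
        (mu, sigma), = rng.choices(components, weights=weights, k=1)
        samples.append(rng.gauss(mu, sigma))
    return samples

rng = random.Random(0)
samples = sample_mixture(
    weights=[0.7, 0.3], means=[10.0, 20.0], stdevs=[1.0, 2.0],
    n_samples=1000, rng=rng,
)
ordered = sorted(samples)
median = statistics.median(samples)          # point forecast
lo, hi = ordered[50], ordered[950]           # ~90% prediction interval
```

Reporting quantiles of the sampled distribution, rather than a single point, is what lets the model express forecast uncertainty.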
## Frequently Asked Questions

### Q: What makes this model unique?
The model combines patch-based input processing with a transformer encoder, allowing it to handle diverse time series forecasting tasks while keeping the parameter count at 13.8M, which makes it efficient to deploy.
### Q: What are the recommended use cases?
The model is ideal for time series forecasting tasks requiring efficient processing and accurate predictions, particularly when working with multiple variables and known dynamic covariates. It's suitable for applications where computational resources are limited but prediction quality cannot be compromised.