Mantis-8M
| Property | Value |
|---|---|
| Author | paris-noah |
| Model Type | Time Series Classification Foundation Model |
| Paper | arXiv:2502.15637 |
| Repository | HuggingFace |
What is Mantis-8M?
Mantis-8M is a lightweight foundation model developed by Huawei Noah's Ark Lab for time series classification. It combines efficiency with ease of use, integrating seamlessly with popular machine learning frameworks such as scikit-learn.
Implementation Details
The model is implemented in Python and can be installed via pip as `mantis-tsfm`. It uses modern deep learning techniques while keeping the parameter count to roughly 8M, which makes both training and inference efficient. Key features:
- Pre-trained foundation model architecture optimized for time series data
- Flexible adapter system for handling multi-channel inputs
- Compatible with GPU acceleration through CUDA support
- Scikit-learn style API for easy integration into existing workflows
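To make the "scikit-learn style API" point concrete, here is a minimal, self-contained sketch of that workflow: a frozen feature extractor with `fit`/`transform` followed by a lightweight classifier head. All class names and the stand-in "embedding" are hypothetical illustrations of the interface style, not the actual mantis-tsfm API.

```python
import numpy as np

class ToyEmbedder:
    """Stand-in for a pre-trained backbone: maps each series to a feature vector.
    Follows scikit-learn's fit/transform convention."""
    def fit(self, X, y=None):
        return self  # pre-trained weights: nothing to learn here

    def transform(self, X):
        X = np.asarray(X, dtype=float)
        # Summary statistics act as a toy "embedding" of each series.
        return np.stack([X.mean(1), X.std(1), X.min(1), X.max(1)], axis=1)

class NearestCentroid:
    """Tiny classifier head with the usual fit/predict interface."""
    def fit(self, Z, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([Z[y == c].mean(0) for c in self.classes_])
        return self

    def predict(self, Z):
        # Distance from each embedding to each class centroid.
        d = np.linalg.norm(Z[:, None, :] - self.centroids_[None], axis=2)
        return self.classes_[d.argmin(1)]

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, (10, 64)), rng.normal(3, 1, (10, 64))])
y = np.array([0] * 10 + [1] * 10)

Z = ToyEmbedder().fit(X).transform(X)  # feature extraction step
clf = NearestCentroid().fit(Z, y)      # lightweight head on the embeddings
preds = clf.predict(Z)
print((preds == y).mean())
```

Because both pieces follow the standard estimator conventions, they compose with the rest of the scikit-learn ecosystem (pipelines, cross-validation) the same way a pre-trained extractor would.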
Core Capabilities
- Feature extraction from time series data
- Fine-tuning on custom datasets
- Dimension reduction through PCA-based adapters
- Probability-based predictions with calibrated outputs
- Support for high-dimensional time series through channel reduction
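The PCA-based channel reduction mentioned above can be sketched as follows: treat each time step of each sample as one observation over the channels, fit principal directions of the channel covariance, and project onto the top few before handing the series to a model with a fixed channel budget. Function names are illustrative, not the mantis-tsfm API.

```python
import numpy as np

def fit_channel_pca(X, n_keep):
    """X: (n_samples, n_channels, seq_len). Returns a (n_channels, n_keep)
    projection matrix onto the top principal channel combinations."""
    # Every time step of every sample becomes one observation over channels.
    obs = X.transpose(0, 2, 1).reshape(-1, X.shape[1])
    obs = obs - obs.mean(axis=0)
    # Principal directions of the channel covariance via SVD.
    _, _, vt = np.linalg.svd(obs, full_matrices=False)
    return vt[:n_keep].T

def reduce_channels(X, W):
    """Project (n_samples, n_channels, seq_len) down to (n_samples, n_keep, seq_len)."""
    return np.einsum('ncl,ck->nkl', X, W)

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 12, 100))  # 8 samples, 12 channels, length 100
W = fit_channel_pca(X, n_keep=5)
X_red = reduce_channels(X, W)
print(X_red.shape)  # (8, 5, 100)
```

The sequence length is untouched; only the channel dimension shrinks, which is what lets a fixed-input-channel backbone accept arbitrarily high-dimensional series.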
Frequently Asked Questions
Q: What makes this model unique?
Mantis-8M pairs a lightweight architecture with strong time series classification capabilities. Its adapter system handles high-dimensional, multi-channel inputs, and its scikit-learn-style interface keeps it compatible with standard machine learning workflows.
Q: What are the recommended use cases?
The model is well suited to time series classification tasks, especially those involving multi-channel data. It is particularly useful when dimensionality must be reduced without sacrificing classification accuracy, or when you need a pre-trained foundation model that can be easily fine-tuned for a specific domain.