# Cydonia-24B-v2.1
| Property | Value |
|---|---|
| Base Model | Mistral Small 2501 |
| Parameter Count | 24 Billion |
| Model Type | Language Model |
| Author | TheDrummer |
| Primary Use | Text Generation, Chat |
## What is Cydonia-24B-v2.1?
Cydonia-24B-v2.1 is an advanced language model that builds upon Mistral's latest 'Small' model (2501). It represents a significant fine-tuning effort focused on enhancing prose quality, fostering creative outputs, and achieving a more balanced tone by reducing excessive positivity. The model maintains compatibility with multiple chat templates, making it versatile for various applications.
## Implementation Details
The model is available in multiple formats: the original weights, GGUF quantizations (with iMatrix-based quants recommended), and an EXL2 version. It supports several chat templates, with Mistral v7 Tekken recommended for best results.
- Built on Mistral's 2501 architecture
- Specialized fine-tuning for improved prose generation
- Multiple format availability for different deployment scenarios
- Optimized template support system
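To make the recommended template concrete, here is a minimal sketch of a single-turn prompt builder in the Mistral v7 (Tekken) style. The exact special-token layout and whitespace are an assumption on my part; verify them against the chat template shipped in the model's tokenizer config before relying on this.

```python
def format_tekken_prompt(system: str, user: str) -> str:
    """Build a single-turn prompt in a Mistral v7 (Tekken)-style layout.

    Assumption: [SYSTEM_PROMPT]/[INST] tag layout as sketched below;
    check the model's own tokenizer config for the authoritative template.
    """
    return f"<s>[SYSTEM_PROMPT]{system}[/SYSTEM_PROMPT][INST]{user}[/INST]"

prompt = format_tekken_prompt(
    "You are a creative writing assistant.",
    "Write an opening line for a mystery novel.",
)
print(prompt)
```

In practice, most inference frontends (llama.cpp, text-generation-webui, SillyTavern) can apply the template for you if you select the matching preset.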
## Core Capabilities
- Enhanced prose generation and creative writing
- Support for multiple chat templates (Mistral v7 Tekken, Metharme, Alpaca)
- Balanced response tone with reduced artificial positivity
- Versatile deployment options through various formats
## Frequently Asked Questions
**Q: What makes this model unique?**
This model stands out through its specialized fine-tuning that enhances prose quality and creativity while maintaining a more natural tone. The multiple format availability and template support make it particularly versatile for different use cases.
**Q: What are the recommended use cases?**
The model is particularly well-suited for creative writing, chat applications, and general text generation tasks. It performs optimally with the Mistral v7 Tekken template and can be effectively used for story generation with the Alpaca template.
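For story generation with the Alpaca template, a prompt follows the standard instruction/response layout. The sketch below assumes the common Alpaca preamble; the example instruction is purely illustrative.

```python
def format_alpaca_prompt(instruction: str) -> str:
    """Build a prompt in the standard Alpaca instruction/response layout."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = format_alpaca_prompt("Write a short story about a lighthouse keeper.")
print(prompt)
```

The model continues generating after the `### Response:` header, so that header should be the last thing in the prompt.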