# MN-12B-Mag-Mell-R1

| Property | Value |
| --- | --- |
| Parameter Count | 12.2B |
| Model Type | Text Generation |
| Architecture | Mistral-based Merged Model |
| Tensor Type | BF16 |
| Papers | DARE, TIES |
## What is MN-12B-Mag-Mell-R1?
MN-12B-Mag-Mell-R1 is a language model created through a multi-stage merge of seven Mistral-based models. Named after the Celtic Otherworld, it is intended as a "Best of Nemo" blend, designed specifically for creative and fictional applications.
## Implementation Details
The model employs a three-part merge strategy: Hero (focused on roleplay and trope coverage), Monk (emphasizing intelligence and groundedness), and Deity (specializing in prose and literary flair). The components are combined using the DARE-TIES merge methodology, and the model expects ChatML prompt formatting.
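The DARE-TIES combination mentioned above can be sketched on toy weight vectors. This is a minimal illustration of the core steps from the two cited papers (DARE's drop-and-rescale sparsification, followed by TIES's sign election and agreement-filtered averaging), not the actual recipe used for this model; the function name, drop rate, and arrays are illustrative only.

```python
import numpy as np

def dare_ties_merge(base, finetuned, drop_rate=0.9, seed=0):
    """Toy DARE-TIES merge of several fine-tunes of `base` (flat vectors)."""
    rng = np.random.default_rng(seed)
    deltas = []
    for ft in finetuned:
        delta = ft - base
        # DARE: randomly drop a fraction of each task vector's parameters,
        # then rescale the survivors by 1/(1 - drop_rate).
        mask = rng.random(delta.shape) >= drop_rate
        deltas.append(delta * mask / (1.0 - drop_rate))
    deltas = np.stack(deltas)
    # TIES: elect a per-parameter sign from the summed deltas...
    elected = np.sign(deltas.sum(axis=0))
    agree = np.sign(deltas) == elected
    # ...then average only the deltas whose sign agrees with the election.
    contrib = np.where(agree, deltas, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)
    return base + contrib.sum(axis=0) / counts
```

In practice each "parameter" here stands in for a full weight tensor, and the drop rate and per-model weights are tuned per merge.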
- Recommended sampler settings: Temperature 1.25, MinP 0.2
- ChatML prompt formatting strongly recommended
- Built with the DARE-TIES merge methodology
- Weights stored in BF16 precision
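The settings above can be applied with a standard `transformers` pipeline. This is a minimal sketch, not an official loader: the Hugging Face repo id is an assumption, and `min_p` sampling requires a reasonably recent `transformers` release. The heavyweight download happens only when `generate` is called.

```python
def chatml_prompt(system: str, user: str) -> str:
    """Wrap a system + user turn in ChatML tags and open the assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

def generate(prompt: str) -> str:
    # Requires `transformers` and `torch`, plus ~24 GB of memory in BF16.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "inflatebot/MN-12B-Mag-Mell-R1"  # assumed repo id
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    ids = tok(prompt, return_tensors="pt").input_ids
    out = model.generate(
        ids,
        max_new_tokens=512,
        do_sample=True,
        temperature=1.25,  # recommended temperature
        min_p=0.2,         # recommended MinP
    )
    return tok.decode(out[0][ids.shape[-1]:], skip_special_tokens=True)

prompt = chatml_prompt("You are a creative storyteller.",
                       "Describe a city in the Otherworld.")
```

If the repo ships a ChatML chat template, `tokenizer.apply_chat_template` can replace the manual `chatml_prompt` helper.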
## Core Capabilities
- Advanced worldbuilding comparable to classic adventuring models
- High-quality prose generation with minimal artifacts
- Creative metaphor generation
- Balanced intelligence and artistic expression
- Enhanced fictional narrative capabilities
## Frequently Asked Questions
**Q: What makes this model unique?**
The model's distinctive three-part merge architecture combines specialized capabilities for roleplay, intelligence, and literary prowess, creating a versatile tool for creative writing and worldbuilding.
**Q: What are the recommended use cases?**
This model excels in creative writing, fictional narrative generation, worldbuilding, and any application requiring a balance of intellectual depth and artistic expression. It's particularly well-suited for interactive storytelling and complex narrative development.