# BuddyGlassUncensored2025.4
| Property | Value |
|---|---|
| Base Model | Mistral-Small-24B-Instruct-2501 |
| Author | darkc0de |
| Merge Method | DARE TIES |
| Model URL | Hugging Face |
## What is BuddyGlassUncensored2025.4?
BuddyGlassUncensored2025.4 is a language model created by merging four 24B-parameter models with the DARE TIES method. Built on the Mistral-Small-24B-Instruct-2501 base, it combines Dolphin3.0-Mistral-24B, Mistral-Small-24B-Instruct-2501-abliterated, Cydonia-24B-v2, and Arcee-Blitz-abliterated.
## Implementation Details
The model uses a uniform merging strategy: each constituent model contributes with a density of 0.5 and a weight of 0.5. Tensors are merged in float16, and int8 masking is enabled to reduce memory use during the merge. A minimal sketch of the per-tensor arithmetic follows the list below.
- Uniform density and weight (0.5) across all merged models
- float16 dtype for the merge
- int8 masking enabled to save memory
- Non-normalized merge (agreeing deltas are summed rather than renormalized)
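
As a rough illustration of what a DARE TIES merge does per tensor, the sketch below applies drop-and-rescale to each model's delta against the base, then a TIES-style sign election, using the density and weight of 0.5 described above. This is a conceptual sketch only: `dare_ties_merge` and its arguments are illustrative names, not mergekit's API, and real merges are run from a mergekit configuration rather than hand-written code.

```python
import torch

def dare_ties_merge(base, finetuned, density=0.5, weight=0.5):
    """Merge one parameter tensor from several fine-tuned models into `base`.

    `finetuned` is a list of tensors with the same shape as `base`.
    All names here are illustrative, not mergekit's actual implementation.
    """
    trimmed = []
    for theta in finetuned:
        delta = theta - base
        # DARE: randomly drop (1 - density) of the delta entries, then
        # rescale the survivors by 1/density so the expected update is kept.
        mask = (torch.rand_like(delta) < density).to(delta.dtype)
        trimmed.append(delta * mask / density)

    # TIES: elect a per-element sign from the weighted deltas and keep only
    # the contributions that agree with it.
    stacked = torch.stack([weight * d for d in trimmed])
    elected = torch.sign(stacked.sum(dim=0))
    agree = (torch.sign(stacked) == elected).to(stacked.dtype)
    # normalize=False: agreeing deltas are summed, not averaged.
    return base + (stacked * agree).sum(dim=0)
```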
## Core Capabilities
- Enhanced instruction following inherited from the Mistral base model
- Balanced performance characteristics from merging multiple donor models
- Optimized for both efficiency and accuracy
- Broad language understanding drawn from diverse model sources
## Frequently Asked Questions
**Q: What makes this model unique?**
Its distinguishing feature is the balanced DARE TIES merge of four 24B-parameter models, with each component contributing equally (density and weight of 0.5) to the final model's capabilities.
**Q: What are the recommended use cases?**
Given its architecture and merged capabilities, this model is well-suited for advanced language understanding tasks, instruction-following applications, and scenarios requiring balanced performance across multiple domains.
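
For instruction-following use, the model can be loaded like any other Hugging Face causal language model. The repository id below is an assumption based on the author and model name; verify the exact path on the model page.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id (author/model name); check the model page for the real one.
model_id = "darkc0de/BuddyGlassUncensored2025.4"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain what a DARE TIES merge is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```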