# Peach-9B-8k-Roleplay
| Property | Value |
|---|---|
| Parameter Count | 8.83B |
| Model Type | Text Generation |
| Architecture | LLaMA-based |
| License | MIT |
| Tensor Type | BF16 |
## What is Peach-9B-8k-Roleplay?
Peach-9B-8k-Roleplay is a language model fine-tuned from the Yi-1.5-9B base model and optimized specifically for roleplay and conversational tasks. It scores 66.19% on MMLU and 69.07% on CMMLU (5-shot), placing it among the strongest models under 34B parameters on these benchmarks.
## Implementation Details
The model was developed against the Transformers library (version 4.37.2) and PyTorch 1.13.1. It supports both Chinese and English and handles a context window of up to 8K tokens.
- Specialized for roleplay through training on 100K+ synthetic conversations
- Implements advanced generation parameters including temperature control and repetition penalty
- Supports chat template formatting for structured conversations
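To illustrate the chat-template formatting mentioned above, the sketch below assembles a ChatML-style prompt by hand. This is a hypothetical stand-in for the model's actual template, which ships with the tokenizer and is normally applied via `tokenizer.apply_chat_template`; the exact control tokens here are an assumption for illustration.

```python
# Hypothetical ChatML-style template, for illustration only.
# The authoritative template ships with the model's tokenizer and is
# normally applied via tokenizer.apply_chat_template(messages, ...).

def build_prompt(messages):
    """Flatten a list of {role, content} dicts into one prompt string."""
    parts = [
        f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>"
        for msg in messages
    ]
    # Leave the assistant turn open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a tavern keeper in a fantasy town."},
    {"role": "user", "content": "Good evening! What's on the menu tonight?"},
]
print(build_prompt(messages))
```

Structuring turns this way keeps character instructions (the system message) cleanly separated from user input, which is what the model's chat-template support is for.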
## Core Capabilities
- Bilingual support (Chinese and English)
- 8K token context window
- Strong performance on general knowledge (MMLU/CMMLU)
- Optimized for character-based interactions
- Efficient deployment with bfloat16 precision
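The bfloat16 deployment footprint can be estimated directly from the parameter count: 2 bytes per parameter for the weights alone, before activations and the KV cache. A quick back-of-the-envelope check:

```python
# Rough VRAM estimate for the weights alone in bfloat16 (2 bytes/param).
# Activations and the 8K-token KV cache add further overhead on top.

PARAMS = 8.83e9        # parameter count from the model card
BYTES_PER_PARAM = 2    # bfloat16 width

weight_bytes = PARAMS * BYTES_PER_PARAM
weight_gib = weight_bytes / 1024**3
print(f"~{weight_gib:.1f} GiB for weights in bf16")
```

This puts the weights at roughly 16 to 17 GiB, which is why bf16 precision is what makes single-GPU deployment of this model practical.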
## Frequently Asked Questions
**Q: What makes this model unique?**
The model stands out for its exceptional performance-to-parameter ratio, offering strong capabilities in roleplay and conversation while maintaining a relatively small 9B parameter size. Its bilingual proficiency and extended context window make it particularly versatile for various applications.
**Q: What are the recommended use cases?**
The model is best suited for character-based interactions, roleplaying scenarios, and general conversational tasks in both Chinese and English. Users should note its limitations on mathematical tasks, coding, and complex logical reasoning, which are typical of models at this scale.