# llama-3.2-Korean-Bllossom-3B
| Property | Value |
|---|---|
| Parameter Count | 3.21B |
| Model Type | Bilingual LLM (Korean-English) |
| Base Model | Meta-Llama-3.2-3B |
| License | Llama 3.2 |
| Paper | Research Paper |
## What is llama-3.2-Korean-Bllossom-3B?
llama-3.2-Korean-Bllossom-3B is a bilingual (Korean-English) language model that extends Meta's Llama 3.2 3B base model with robust Korean language capabilities. It was fully fine-tuned on 150GB of refined Korean data while maintaining the base model's English proficiency.
## Implementation Details
The model uses a transformer architecture at BF16 precision and was instruction-tuned without degrading the base model's capabilities. Notably, it achieves state-of-the-art results on the LogicKor benchmark among models under 5B parameters.
- 100% full-tuning with 150GB of curated Korean data
- Precise instruction tuning methodology
- Maintains original English capabilities while adding Korean support
- Scores above 6 on the LogicKor benchmark
- Compatible with text-generation-inference systems
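The BF16 precision mentioned above keeps float32's 8-bit exponent (and therefore its dynamic range) but truncates the mantissa to 7 bits, halving memory per weight. A minimal illustration of the format itself, not specific to this model; rounding here is simple truncation for clarity:

```python
import struct

def to_bf16(x: float) -> float:
    """Truncate a float32 value to bfloat16 precision (keep the top 16 bits)."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]   # float32 bit pattern
    return struct.unpack(">f", struct.pack(">I", bits & 0xFFFF0000))[0]

print(to_bf16(3.14159265))  # 3.140625 -- only ~3 significant decimal digits survive
print(to_bf16(1.0e38))      # large magnitudes survive, unlike float16 (max ~6.5e4)
```

The trade-off is the usual reason model cards list BF16: half the memory of float32 with the same overflow behavior, at the cost of precision that matters little for neural-network weights.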
## Core Capabilities
- Bilingual text generation in Korean and English
- Complex reasoning and logic tasks
- Instruction following in both languages
- Maintains performance parity across languages
- Commercial usage supported
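Instruction-tuned models in the Llama 3 family consume a specific chat layout; in practice `tokenizer.apply_chat_template` from `transformers` builds it for you, but a hand-rolled sketch makes the structure explicit. This assumes the standard Llama 3 header/`<|eot_id|>` template; verify against the model's actual tokenizer configuration:

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Assemble a single-turn prompt in the (assumed) Llama 3 chat layout."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n" + system + "<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n" + user + "<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"  # model continues here
    )

# Bilingual example: Korean system prompt, English user question.
prompt = build_llama3_prompt(
    "당신은 한국어와 영어에 능통한 어시스턴트입니다.",  # "You are an assistant fluent in Korean and English."
    "Summarize the model's strengths in one sentence.",
)
print(prompt)
```

Mixing languages across the system and user turns, as above, is exactly the bilingual instruction-following scenario the model targets.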
## Frequently Asked Questions
**Q: What makes this model unique?**
This model stands out for its genuine bilingual capabilities, achieving high performance in both Korean and English without sacrificing quality in either language. It's particularly notable for achieving top scores on LogicKor benchmarks among models under 5B parameters, without specifically targeting benchmark performance.
**Q: What are the recommended use cases?**
The model is ideal for bilingual applications requiring Korean-English language processing, including content generation, translation assistance, and complex reasoning tasks. Its commercial license makes it suitable for business applications while maintaining research-grade performance.