weblab-10b-instruction-sft
| Property | Value |
|---|---|
| Parameter Count | 10 billion |
| Model Type | GPT-NeoX |
| Architecture | 36-layer, 4864-hidden-size transformer |
| License | cc-by-nc-4.0 |
What is weblab-10b-instruction-sft?
weblab-10b-instruction-sft is a Japanese-centric, bilingual (Japanese and English) language model built on the GPT-NeoX architecture. It was pre-trained on approximately 600B tokens drawn from Japanese C4 and The Pile, then instruction-tuned on several datasets, including Alpaca and Flan.
Implementation Details
The model uses 36 transformer layers with a 4864-dimensional hidden size. Pre-training was followed by supervised instruction fine-tuning on multiple datasets in both English and Japanese.
- Pre-trained on 600B tokens from Japanese C4 and The Pile
- Fine-tuned on Alpaca (English and Japanese), Flan 2021, Flan CoT, and Flan Dialog
- Implemented in PyTorch and compatible with text-generation-inference
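Because the model follows the standard GPT-NeoX layout, it can be loaded with the Hugging Face `transformers` library. The snippet below is a minimal sketch: the repository id `matsuo-lab/weblab-10b-instruction-sft`, the fp16 setting, and the generation parameters are assumptions not stated in this card, so adjust them to the actual published checkpoint and your hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face repo id; substitute the actual published path.
MODEL_ID = "matsuo-lab/weblab-10b-instruction-sft"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # fp16 keeps the 10B weights within a single large GPU
    device_map="auto",          # requires the `accelerate` package
)

prompt = "日本で一番高い山は何ですか？"  # "What is the tallest mountain in Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
        top_p=0.95,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```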
Core Capabilities
- Achieves 59.11% average accuracy on the JGLUE 8-task benchmark
- Excels at Japanese language tasks, with 78.78% average accuracy on the JGLUE 4-task benchmark
- Supports both Japanese and English language processing
- Optimized for instruction-following tasks
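Instruction-tuned checkpoints like this one generally work best when prompts follow the template used during fine-tuning. The helper below is a sketch assuming an Alpaca-style Japanese template ("### 指示:" / "### 応答:"); the exact wording this model was trained with is not stated here, so verify it against the official model card before relying on it.

```python
def build_prompt(instruction: str, input_text: str = "") -> str:
    """Assemble an Alpaca-style Japanese instruction prompt (assumed template)."""
    header = (
        "以下は、タスクを説明する指示です。"
        "要求を適切に満たす応答を書きなさい。"
    )
    if input_text:
        return (
            f"{header}\n\n### 指示:\n{instruction}\n\n"
            f"### 入力:\n{input_text}\n\n### 応答:\n"
        )
    return f"{header}\n\n### 指示:\n{instruction}\n\n### 応答:\n"


# Example: "What is the capital of Japan?"
prompt = build_prompt("日本の首都はどこですか？")
```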
Frequently Asked Questions
Q: What makes this model unique?
This model stands out for its specialized focus on Japanese language processing while maintaining multilingual capabilities. Its instruction fine-tuning and impressive performance on JGLUE benchmarks make it particularly suitable for Japanese-centric applications.
Q: What are the recommended use cases?
The model is ideal for Japanese language processing tasks, including question answering, natural language inference, and text generation. It's particularly well-suited for applications requiring both Japanese and English language understanding.
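As an illustration of such a use case, the snippet below runs a Japanese natural-language-inference style instruction through the `transformers` text-generation pipeline. The repository id and the Alpaca-style template remain assumptions, as noted above.

```python
from transformers import pipeline

# Assumed repo id; replace with the actual published checkpoint.
generator = pipeline(
    "text-generation",
    model="matsuo-lab/weblab-10b-instruction-sft",
    torch_dtype="auto",
    device_map="auto",
)

# Assumed Alpaca-style Japanese prompt: premise/hypothesis entailment check.
prompt = (
    "以下は、タスクを説明する指示です。要求を適切に満たす応答を書きなさい。\n\n"
    "### 指示:\n"
    "前提:「猫がマットの上で寝ている。」 仮説:「マットの上に動物がいる。」 "
    "仮説は前提から導けますか？理由も述べてください。\n\n"
    "### 応答:\n"
)

result = generator(
    prompt,
    max_new_tokens=96,
    do_sample=True,
    temperature=0.7,
    return_full_text=False,
)
print(result[0]["generated_text"])
```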