Granite 8B Code Instruct GGUF
| Property | Value |
|---|---|
| Parameter Count | 8.05B |
| License | Apache 2.0 |
| Paper | Granite Code Models Paper |
| Developer | IBM Research |
| Release Date | May 6, 2024 |
What is granite-8b-code-instruct-GGUF?
Granite-8B-Code-Instruct is a code-generation model developed by IBM Research and optimized for programming tasks. This GGUF build is a quantized version of the original model, making it practical for local deployment while preserving strong performance across multiple programming languages.
Implementation Details
The model is available in multiple quantization formats, ranging from 3.06GB (Q2_K) to 16.12GB (f16), letting users trade output quality against memory and disk requirements. It uses a specialized prompt template and has been fine-tuned on permissively licensed instruction data.
- Multiple quantization options for different hardware capabilities
- Comprehensive support for Python, JavaScript, Java, Go, C++, and Rust
- Trained on 8 diverse datasets including code-specific and mathematical instruction sets
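Choosing among the quantization files comes down to how much memory you can spare. A minimal sketch of that decision, using only the two file sizes quoted above (the other quant files' sizes would need to be filled in from the repository listing, and a `pick_quant` helper like this is purely illustrative, not part of any official tooling):

```python
from typing import Optional

# Only the two sizes quoted in the text are filled in; add the other
# quant files' sizes from the repository listing as needed.
QUANT_SIZES_GB = {
    "Q2_K": 3.06,
    "f16": 16.12,
}

def pick_quant(available_gb: float, headroom_gb: float = 1.5) -> Optional[str]:
    """Return the largest quant whose file fits within the budget.

    headroom_gb leaves room for the KV cache and runtime overhead,
    since the file size alone understates memory use at inference time.
    """
    budget = available_gb - headroom_gb
    fitting = {q: s for q, s in QUANT_SIZES_GB.items() if s <= budget}
    if not fitting:
        return None
    return max(fitting, key=fitting.get)
```

For example, a machine with 6GB free would land on Q2_K, while 20GB comfortably fits the full f16 file.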
Core Capabilities
- Strong code synthesis performance (57.9% pass@1 for Python)
- Code explanation capabilities across multiple languages
- Bug fixing functionality with up to 48.2% accuracy in Java
- Mathematical reasoning and problem-solving abilities
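The pass@1 figures above are conventionally computed with the unbiased pass@k estimator from the Codex paper: generate n samples per problem, count the c that pass the tests, and estimate the chance that at least one of k drawn samples passes. A short sketch of that formula (the example numbers are illustrative, not from the Granite evaluation):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: 1 - C(n-c, k) / C(n, k).

    n = total samples generated per problem,
    c = samples that pass the unit tests,
    k = number of attempts the metric allows.
    """
    if n - c < k:
        # Every size-k draw must contain at least one passing sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Illustrative: 12 of 20 samples passing gives pass@1 = 0.6.
score = pass_at_k(n=20, c=12, k=1)
```

Averaging this estimate over all benchmark problems yields the reported percentage.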
Frequently Asked Questions
Q: What makes this model unique?
The model stands out for its balanced performance across multiple programming languages and its availability in various quantization formats, making it accessible for different hardware configurations. Its instruction-following capabilities are enhanced through training on diverse, permissively licensed datasets.
Q: What are the recommended use cases?
The model excels in code generation, explanation, and bug fixing across multiple programming languages. It's particularly effective for Python and Java development, with pass@1 rates above 50% in these languages. It's suitable for both educational purposes and professional development workflows.
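Since the model expects its specialized prompt template, prompts should be wrapped before being sent to a local runtime. A minimal sketch, assuming the Question/Answer turn format used by the Granite instruct models (verify against the chat template bundled with the model files before relying on it):

```python
def format_granite_prompt(user_message: str, system: str = "") -> str:
    """Wrap a request in a Question/Answer turn format.

    Assumption: this mirrors the Granite instruct chat template;
    check the template shipped with the GGUF/tokenizer files.
    """
    parts = []
    if system:
        parts.append(f"System:\n{system}\n")
    parts.append(f"Question:\n{user_message}\n")
    parts.append("Answer:\n")
    return "\n".join(parts)

prompt = format_granite_prompt(
    "Write a Python function that reverses a string."
)
```

The resulting string would then be passed as the prompt to whatever GGUF runtime you use locally, such as llama.cpp or a binding built on it.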