gemma-3-1b-it-qat-q4_0-GGUF
tetf
gemma-3-1b-it-qat-q4_0-GGUF is a text-generation model from tetf on Hugging Face, with 37K downloads.
What is gemma-3-1b-it-qat-q4_0-GGUF?
gemma-3-1b-it-qat-q4_0-GGUF is an open-weight text-generation model released by tetf on Hugging Face. As the name indicates, it is a quantization-aware-trained (QAT) build of google/gemma-3-1b-it, exported at 4-bit Q4_0 precision in the GGUF format used by llama.cpp-compatible runtimes. It has been downloaded 37K times since its release.
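To make the "Q4_0" part concrete, here is a minimal pure-Python sketch of blockwise 4-bit quantization in the spirit of llama.cpp's Q4_0 format. It mirrors the idea (one scale per block of 32 weights, 4-bit signed quants) rather than reproducing the exact ggml reference implementation; the function names and the sample weights are illustrative only.

```python
# Sketch of Q4_0-style quantization: each block of 32 weights is stored
# as one scale d plus 32 integers q in [-8, 7]; a weight is recovered
# as d * q. This is an approximation of ggml's scheme, not its exact code.

BLOCK = 32

def quantize_q4_0(xs):
    """Quantize one block of 32 floats to (scale, list of 4-bit ints)."""
    assert len(xs) == BLOCK
    # Use the element with the largest magnitude to set the scale,
    # mapping it to -8 (as ggml does), so d may be negative.
    m = max(xs, key=abs)
    d = m / -8 if m != 0 else 1.0
    qs = [max(-8, min(7, round(x / d))) for x in xs]
    return d, qs

def dequantize_q4_0(d, qs):
    """Reconstruct the block from its scale and quantized values."""
    return [d * q for q in qs]

# Illustrative block of weights in [-1, 1]
weights = [(-1) ** i * (i / 31.0) for i in range(BLOCK)]
d, qs = quantize_q4_0(weights)
recon = dequantize_q4_0(d, qs)
err = max(abs(a - b) for a, b in zip(weights, recon))
```

The reconstruction error per weight stays on the order of the block scale `d`, which is why Q4_0 shrinks a model to roughly a quarter of its fp16 size with only a modest quality loss.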
Specifications
- Developer: tetf
- Library: —
- Pipeline tag: text-generation
- License: gemma
- Base model: google/gemma-3-1b-it
- Downloads: 37K
- Likes: 0
- Created: 2025-04-11
- Last modified: —
- Hugging Face: tetf/gemma-3-1b-it-qat-q4_0-GGUF
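Because the repo ships a GGUF file, the model can be run locally with llama.cpp tooling. A minimal sketch, assuming the `.gguf` filename below (check the repo's file list for the actual name):

```shell
# Download the GGUF file from the Hugging Face repo
# (the filename is an assumption -- confirm it against the repo).
huggingface-cli download tetf/gemma-3-1b-it-qat-q4_0-GGUF \
  gemma-3-1b-it-q4_0.gguf --local-dir .

# Start an interactive chat with llama.cpp's CLI
llama-cli -m gemma-3-1b-it-q4_0.gguf -cnv -p "You are a helpful assistant."
```

The same file also works with other GGUF-aware runtimes such as llama-cpp-python or Ollama.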
Use gemma-3-1b-it-qat-q4_0-GGUF with PromptLayer
PromptLayer lets teams manage, evaluate, and observe prompts that run on gemma-3-1b-it-qat-q4_0-GGUF alongside every other model in their stack. Version prompts, run evals across models, and ship safe rollouts from the same dashboard.
Ready to try it yourself? Sign up for PromptLayer and start managing your prompts in minutes.