gemma-3-1b-it-qat-4bit
mlx-community
gemma-3-1b-it-qat-4bit is a text-generation model published by mlx-community on Hugging Face. It is a 4-bit, quantization-aware-trained (QAT) build of google/gemma-3-1b-it and has been downloaded 75K times.
What is gemma-3-1b-it-qat-4bit?
gemma-3-1b-it-qat-4bit is an open-weight text-generation model released by mlx-community on Hugging Face. As the name indicates, it is a 4-bit quantized build of google/gemma-3-1b-it produced with quantization-aware training (QAT), which accounts for quantization during training and typically preserves more quality than quantizing after the fact. It has been downloaded 75K times and received 4 likes since release.
Specifications
- Developer: mlx-community
- Library: transformers
- Pipeline tag: text-generation
- License: gemma
- Base model: google/gemma-3-1b-it
- Downloads: 75K
- Likes: 4
- Created: 2025-04-15
- Last modified: —
- Hugging Face: mlx-community/gemma-3-1b-it-qat-4bit
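The listing itself carries no usage snippet; the sketch below shows one plausible way to run the checkpoint locally, assuming the mlx-lm package (the usual runtime for mlx-community models, Apple Silicon only). The prompt-formatting helper follows Gemma's documented chat turn markers; everything else (prompt text, token budget) is illustrative.

```python
# Sketch: local inference for mlx-community/gemma-3-1b-it-qat-4bit.
# Assumes `pip install mlx-lm` on an Apple Silicon Mac; the repo id is
# taken from the listing above.

REPO_ID = "mlx-community/gemma-3-1b-it-qat-4bit"


def gemma_prompt(user_text: str) -> str:
    """Wrap a user message in Gemma's chat turn markers,
    ending with an open model turn for the reply."""
    return (
        "<start_of_turn>user\n" + user_text + "<end_of_turn>\n"
        "<start_of_turn>model\n"
    )


try:
    # mlx-lm is only available on Apple Silicon; guard the import so the
    # formatting helper above still works elsewhere.
    from mlx_lm import load, generate
except ImportError:
    load = generate = None

if load is not None:
    model, tokenizer = load(REPO_ID)  # downloads weights from the Hub on first use
    reply = generate(
        model,
        tokenizer,
        prompt=gemma_prompt("Explain quantization-aware training in one sentence."),
        max_tokens=128,
    )
    print(reply)
```

Because the model is quantized to 4 bits, the 1B-parameter weights fit comfortably in a few hundred megabytes of memory, which is the main point of running this variant over the full-precision base model.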
Use gemma-3-1b-it-qat-4bit with PromptLayer
PromptLayer lets teams manage, evaluate, and observe prompts that run on gemma-3-1b-it-qat-4bit alongside every other model in their stack. Version prompts, run evals across models, and ship safe rollouts from the same dashboard.
Ready to try it yourself? Sign up for PromptLayer and start managing your prompts in minutes.