Jasmine-350M

UBC-NLP

Jasmine-350M is a 350M-parameter Arabic language model designed for few-shot learning, part of the JASMINE suite of models trained on 235GB of Arabic text data.

| Property | Value |
|---|---|
| Model Type | Text Generation, Transformers |
| Framework | PyTorch |
| Paper | EMNLP 2023 |
| Training Data | 235GB Arabic text |

What is Jasmine-350M?

Jasmine-350M is part of the JASMINE suite of Arabic GPT models, designed specifically for few-shot learning tasks. With 350 million parameters, it strikes a balance between computational efficiency and performance. It was introduced in a paper presented at EMNLP 2023 and is notable for its specialized focus on Arabic language processing.

Implementation Details

The model is implemented in PyTorch and follows the GPT-Neo architecture. It was trained on a 235GB corpus of Arabic text, among the largest pretraining corpora assembled for Arabic language models.

  • Decoder-only transformer architecture based on GPT-Neo
  • Optimized for few-shot learning scenarios
  • Trained on a diverse Arabic text corpus
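Since the model follows the GPT-Neo architecture, it can in principle be loaded through the standard `transformers` causal-LM classes. A minimal sketch is below; the repo id `UBC-NLP/Jasmine-350M` is an assumption based on the organization and model names, not confirmed by this page, so check the actual release before use.

```python
def generate_arabic(prompt: str,
                    model_id: str = "UBC-NLP/Jasmine-350M",  # hypothetical repo id
                    max_new_tokens: int = 50) -> str:
    """Generate an Arabic continuation of `prompt` with a JASMINE checkpoint."""
    # Imported lazily so the sketch stays importable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs,
                             max_new_tokens=max_new_tokens,
                             do_sample=True,
                             top_p=0.9)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `generate_arabic("...")` with an Arabic prompt would download the checkpoint on first use and return the decoded continuation.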

Core Capabilities

  • Arabic text generation and completion
  • Few-shot learning for various NLP tasks
  • Context-aware text processing
  • Natural language understanding in Arabic
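Few-shot use typically means packing a handful of labelled examples and the query into a single prompt that the model completes. The helper below is a generic illustration of that pattern, not an official JASMINE API; the Arabic field labels (نص "text", تصنيف "label") are placeholder choices.

```python
def build_few_shot_prompt(examples, query,
                          input_label="نص",      # "text" (placeholder label)
                          output_label="تصنيف"):  # "label" (placeholder label)
    """Format (text, label) pairs plus a query into one few-shot prompt.

    The prompt ends with an open output field for the model to complete.
    """
    lines = []
    for text, label in examples:
        lines.append(f"{input_label}: {text}")
        lines.append(f"{output_label}: {label}")
    lines.append(f"{input_label}: {query}")
    lines.append(f"{output_label}:")
    return "\n".join(lines)
```

The resulting string would be passed to the model as a single generation prompt, with the completion read off as the predicted label.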

Frequently Asked Questions

Q: What makes this model unique?

Jasmine-350M stands out for its specialized focus on Arabic language processing and its optimization for few-shot learning, making it particularly valuable for applications where limited training data is available.

Q: What are the recommended use cases?

The model is ideal for Arabic text generation, completion tasks, and applications requiring few-shot learning capabilities in Arabic NLP contexts. It's particularly suitable for research and production environments where Arabic language processing is crucial.
