gpt2-horoscopes

Maintained by: shahp7575

| Property | Value |
|---|---|
| Model Base | GPT-2 |
| Training Data | ~12k horoscopes |
| Categories | general, career, love, wellness, birthday |
| Source | Horoscopes.com |

What is gpt2-horoscopes?

gpt2-horoscopes is a GPT-2 model fine-tuned specifically for generating horoscope predictions across five distinct categories. It was trained on a dataset of approximately 12,000 horoscopes scraped from Horoscopes.com, which lets it generate predictions conditioned on a specified category.

Implementation Details

The model implements category-based generation using special tokens for the different horoscope types. It was trained for 5 epochs with a learning rate of 5e-4, 100 warmup steps, and an epsilon of 1e-8, with the sequence length capped at 300 tokens, reaching a final training loss of 2.77.
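
The sketch below shows one plausible way to wire these reported hyperparameters into a standard fine-tuning setup with PyTorch and the transformers library. The optimizer/scheduler pairing and the placeholder step count are assumptions for illustration; the original training script is not reproduced here.

```python
# Hedged sketch: reported hyperparameters plugged into a conventional
# AdamW + linear-warmup setup. The scheduler choice and total_steps value
# are assumptions, not details confirmed by the model card.
import torch
from transformers import GPT2LMHeadModel, get_linear_schedule_with_warmup

model = GPT2LMHeadModel.from_pretrained("gpt2")

epochs = 5
learning_rate = 5e-4
warmup_steps = 100       # the "1e2" warmup steps reported above
adam_epsilon = 1e-8
max_length = 300         # sequence length cap used during training

optimizer = torch.optim.AdamW(
    model.parameters(), lr=learning_rate, eps=adam_epsilon
)

# In a real run this would be len(train_dataloader) * epochs;
# a placeholder value is used here for illustration.
total_steps = 1000
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=warmup_steps, num_training_steps=total_steps
)
```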

  • Specialized tokenization with category markers (<|category|>)
  • Temperature and top-k sampling for generation diversity (sketched in the example after this list)
  • Maximum sequence length of 300 tokens
  • Supports five distinct horoscope categories
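
A minimal generation sketch follows. It loads the published checkpoint with the transformers library and samples with temperature and top-k; the exact prompt format (how the <|category|> marker is placed) is an assumption based on the description above.

```python
# Hedged sketch: category-conditioned generation with temperature and
# top-k sampling. The prompt layout with the <|category|> marker is an
# assumption, not a confirmed training format.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model_name = "shahp7575/gpt2-horoscopes"
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)

# Build a category-conditioned prompt (assumed format).
prompt = "<|category|> career <|horoscope|>"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample up to the 300-token training sequence length.
output = model.generate(
    input_ids,
    max_length=300,
    do_sample=True,
    temperature=0.9,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```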

Core Capabilities

  • Category-specific horoscope generation
  • Flexible text generation parameters
  • Integration with the HuggingFace pipeline API (see the example after this list)
  • Custom prompt formatting with category specification
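
As a usage example, the model can also be called through the HuggingFace pipeline API. The prompt string below again assumes the <|category|> marker format described earlier, and the sampling parameters are illustrative values rather than recommendations from the model card.

```python
# Hedged sketch: text-generation pipeline usage with an assumed
# category-marker prompt and illustrative sampling parameters.
from transformers import pipeline

generator = pipeline("text-generation", model="shahp7575/gpt2-horoscopes")

result = generator(
    "<|category|> love <|horoscope|>",
    max_length=300,
    do_sample=True,
    temperature=0.9,
    top_k=40,
)
print(result[0]["generated_text"])
```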

Frequently Asked Questions

Q: What makes this model unique?

The model's distinguishing feature is its fine-tuning specifically for horoscope generation: special tokens mark each horoscope type, so output can be steered toward a chosen category.

Q: What are the recommended use cases?

The model is intended for educational and experimental purposes, specifically generating category-based horoscope text. It does not attempt to provide genuine astrological predictions and should not be relied on outside experimental or educational contexts.
