DistilBERT Base Uncased Fine-tuned IMDB
| Property | Value |
|---|---|
| License | Apache 2.0 |
| Training Framework | PyTorch 1.9.1 |
| Transformers Version | 4.12.0.dev0 |
What is distilbert-base-uncased-finetuned-imdb?
This model is a fine-tuned version of DistilBERT base uncased, optimized for sentiment analysis on the IMDB dataset. DistilBERT is a lightweight, distilled variant of BERT that retains most of BERT's accuracy while substantially reducing compute and memory requirements.
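A minimal loading sketch is shown below. The repo id is an assumption derived from the model name; on the Hugging Face Hub it may carry a namespace prefix, so substitute the actual id for your copy of the model.

```python
# Minimal loading sketch. The repo id is assumed from the model name; on the
# Hub it may need a prefix such as "<user>/distilbert-base-uncased-finetuned-imdb".
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-imdb"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
```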
Implementation Details
The model was trained with native AMP mixed precision, using the Adam optimizer and a linear learning-rate scheduler. Training ran for 3 epochs with a learning rate of 2e-05 and a batch size of 64 for both training and evaluation; a configuration sketch follows the list below.
- Training Loss: decreased from 2.708 to 2.5385 across the 3 epochs
- Validation Loss: 2.4451 at the end of training
- Optimizer Configuration: Adam with betas=(0.9,0.999) and epsilon=1e-08
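The hyperparameters above map directly onto the Hugging Face Trainer API. The following is a hedged sketch, not the original training script: the `output_dir` value and the use of `TrainingArguments` are assumptions, while the numeric values are those reported above.

```python
# Sketch of the reported training configuration using TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-imdb",  # assumed output path
    learning_rate=2e-5,              # reported learning rate
    per_device_train_batch_size=64,  # reported training batch size
    per_device_eval_batch_size=64,   # reported evaluation batch size
    num_train_epochs=3,              # reported epoch count
    lr_scheduler_type="linear",      # linear learning-rate scheduler
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # Adam epsilon
    fp16=True,                       # native AMP mixed precision
)
```

Setting `fp16=True` is how the Trainer enables native AMP mixed precision on CUDA devices.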
Core Capabilities
- Sentiment Analysis on Movie Reviews
- Text Classification
- Transfer Learning from DistilBERT
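For inference, the model can be wrapped in a `transformers` pipeline. This is a usage sketch; as above, the repo id is an assumption, and the exact label names returned depend on the model's configuration.

```python
# Usage sketch: sentiment analysis via the pipeline API.
# The repo id is an assumption (see the loading example above).
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-imdb",
)
print(classifier("This movie was an absolute masterpiece."))
# Output is a list of {"label": ..., "score": ...} dicts.
```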
Frequently Asked Questions
Q: What makes this model unique?
This model combines the efficiency of DistilBERT with task-specific fine-tuning on movie-review sentiment, making it well suited to IMDB-style content analysis while remaining computationally inexpensive.
Q: What are the recommended use cases?
The model is best suited to sentiment analysis of movie reviews and similar content, particularly uncased (lowercased) English text. It is a good fit for applications that need efficient inference without giving up much accuracy.