t5-small-finetuned-spanish-to-quechua
| Property | Value |
|---|---|
| Base Model | T5-small |
| Training Data | 102,747 sentences |
| BLEU Score | 2.9691 |
| Model Hub | Hugging Face |
What is t5-small-finetuned-spanish-to-quechua?
This is a machine translation model built on the T5-small architecture and fine-tuned to translate Spanish text into Ayacucho Quechua. It was developed by Sara Benel and Jose Vílchez during the SomosNLP Hackathon 2022, as part of an effort to expand machine translation coverage for low-resource languages.
Implementation Details
The model was trained for 46 epochs on a dataset of 102,747 sentences, with validation on 12,844 sentences and testing on 12,843 sentences. Training achieved an evaluation loss of 1.2064 and a BLEU score of 2.9691.
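For readers unfamiliar with the metric, corpus-level BLEU scores such as the 2.9691 reported above are commonly computed with a library like sacrebleu. A minimal sketch, using placeholder strings rather than real model output:

```python
import sacrebleu  # pip install sacrebleu

# Placeholder strings stand in for real Quechua translations; an actual
# evaluation would run over the full 12,843-sentence test split.
hypotheses = ["model output for sentence one", "model output for sentence two"]
references = [["reference for sentence one", "reference for sentence two"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.4f}")
```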
- Built on the T5-small transformer architecture
- Specialized for Ayacucho Quechua dialect
- Extensive training on biblical texts
- Easy integration with the Hugging Face Transformers library (see the usage sketch below)
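As a quick illustration of that integration, the snippet below loads the model with the standard Transformers seq2seq classes. The repository id is an assumption based on the hackathon's Hugging Face organization; check the model hub page for the exact path.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repo id (SomosNLP hackathon org); verify on the Hugging Face hub.
model_name = "hackathon-pln-es/t5-small-finetuned-spanish-to-quechua"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Translate a single Spanish sentence into Ayacucho Quechua.
inputs = tokenizer("Dios ama a todos.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```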
Core Capabilities
- Spanish to Quechua translation
- Optimized for religious and formal text translation
- Support for the Ayacucho dialect of Quechua
- Batch processing capabilities (see the example below)
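Batch translation follows the same pattern; the only addition is padding, so that sentences of different lengths can share one tensor. A minimal sketch, reusing the assumed repo id from above:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "hackathon-pln-es/t5-small-finetuned-spanish-to-quechua"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

sentences = [
    "Dios ama a todos.",
    "El amor es paciente y bondadoso.",
]

# Pad the batch so all sequences align in a single tensor.
batch = tokenizer(sentences, return_tensors="pt", padding=True)
outputs = model.generate(**batch, max_new_tokens=64)

for translation in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(translation)
```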
Frequently Asked Questions
Q: What makes this model unique?
This model addresses the specific challenge of translating between Spanish and Quechua, focusing on the Ayacucho dialect. It's one of the few available models for this language pair and has been trained on a substantial dataset of over 100,000 sentences.
Q: What are the recommended use cases?
The model performs best with religious and formal texts, particularly biblical content, reflecting its training data. It's suitable for organizations working with Quechua-speaking communities, especially in the Ayacucho region, and for other applications that need Spanish-to-Quechua translation.