electra-base-de-squad2

Maintained By: deutsche-telekom

  • Parameter Count: 111M
  • License: MIT
  • Language: German
  • Training Data: deQuAD2.0 (130k QA pairs)
  • Best Performance: 70.97% Exact Match, 76.18% F1

What is electra-base-de-squad2?

electra-base-de-squad2 is a German question-answering model developed by Deutsche Telekom on top of the ELECTRA architecture. It is fine-tuned on deQuAD2.0, a German question-answering dataset containing 130,000 training and 11,000 test QA pairs and covering both answerable and unanswerable questions. On the deQuAD2.0 test set it reaches 70.97% exact match and 76.18% F1.

Implementation Details

The model is built on electra-base-german-uncased and was trained on 8 V100 GPUs. It outperforms other German language models, including German BERT variants, in exact match and F1 for both answerable and unanswerable questions.

  • Base Architecture: ELECTRA German Uncased
  • Training Dataset: deQuAD2.0 (~42MB training set)
  • Evaluation Dataset: deQuAD2.0 test set (~4MB)
  • Performance Metrics: 70.97% Exact Match, 76.18% F1 Score
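
As a minimal sketch of what "built on electra-base-german-uncased" means in practice, the checkpoint can be loaded with the generic transformers Auto classes. The Hub identifier deutsche-telekom/electra-base-de-squad2 is an assumption based on the maintainer and model name; adjust it if the model is hosted under a different ID.

```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "deutsche-telekom/electra-base-de-squad2"  # assumed Hub identifier

# Load the uncased German ELECTRA tokenizer and the extractive-QA head
# (start/end span prediction) that sits on top of the ELECTRA encoder.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

print(model.config.model_type)   # "electra"
print(model.num_parameters())    # on the order of 111M parameters
```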

Core Capabilities

  • Advanced German text comprehension and analysis
  • Accurate question-answering for both answerable and unanswerable queries
  • Efficient context processing and answer extraction
  • Easy integration with the Hugging Face transformers pipeline (see the usage sketch below)
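
A minimal usage sketch with the transformers question-answering pipeline; the model ID and the German example text are illustrative assumptions, not taken from the model card.

```python
from transformers import pipeline

# Assumed Hub identifier, based on the maintainer and model name.
qa = pipeline("question-answering", model="deutsche-telekom/electra-base-de-squad2")

context = (
    "Die Deutsche Telekom ist ein Telekommunikationsunternehmen "
    "mit Hauptsitz in Bonn."
)

result = qa(
    question="Wo hat die Deutsche Telekom ihren Hauptsitz?",
    context=context,
)
print(result["answer"], result["score"])  # expected answer span: "Bonn"
```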

Frequently Asked Questions

Q: What makes this model unique?

This model stands out for its performance on German question-answering tasks, outperforming other German language models such as BERT variants. It achieves strong results on both answerable (67.73% EM) and unanswerable (74.29% EM) questions.
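
Since the reported metrics distinguish answerable from unanswerable questions, here is a sketch of how unanswerable cases can be surfaced at inference time using the pipeline's handle_impossible_answer flag (a standard option of the transformers QA pipeline, not something specific to this model); the model ID and example text are again assumptions.

```python
from transformers import pipeline

qa = pipeline("question-answering", model="deutsche-telekom/electra-base-de-squad2")

context = "Die Deutsche Telekom hat ihren Hauptsitz in Bonn."

# handle_impossible_answer=True lets the pipeline return an empty answer string
# when the model decides the question cannot be answered from the context.
result = qa(
    question="Wie viele Mitarbeitende hat das Unternehmen?",
    context=context,
    handle_impossible_answer=True,
)

if not result["answer"]:
    print("Question judged unanswerable for this context.")
else:
    print(result["answer"])
```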

Q: What are the recommended use cases?

The model is ideal for German language applications requiring precise question-answering capabilities, including automated customer service, information extraction from documents, and intelligent search systems. It's particularly effective when integrated into applications using the Hugging Face transformers pipeline.
