mnasnet_100.rmsp_in1k

Maintained by: timm

MnasNet-100 trained with an RMSProp recipe on ImageNet-1k

  • Parameter Count: 4.42M
  • Model Type: Image Classification
  • License: Apache-2.0
  • Paper: MnasNet: Platform-Aware Neural Architecture Search for Mobile
  • Dataset: ImageNet-1k

What is mnasnet_100.rmsp_in1k?

MnasNet is a mobile-optimized convolutional network architecture discovered through platform-aware neural architecture search. This variant was trained on ImageNet-1k with an RMSProp-based recipe and, at 4.42M parameters, is designed for efficient mobile deployment.
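A minimal classification sketch using the standard timm API is shown below; 'example.jpg' is a placeholder path, not a file shipped with the model.

```python
import timm
import torch
from PIL import Image

# Load the pretrained checkpoint in eval mode.
model = timm.create_model('mnasnet_100.rmsp_in1k', pretrained=True)
model.eval()

# Build the preprocessing pipeline (resize, crop, normalize) that matches
# the model's pretrained configuration.
config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**config, is_training=False)

img = Image.open('example.jpg').convert('RGB')  # placeholder image path
with torch.no_grad():
    logits = model(transform(img).unsqueeze(0))  # shape: [1, 1000]
probs, class_ids = logits.softmax(dim=-1).topk(5)  # top-5 ImageNet-1k classes
```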

Implementation Details

The model was trained with a specialized recipe built around the RMSProp optimizer (TensorFlow 1.0 behaviour), EMA weight averaging, and a step-based learning-rate schedule with warmup; these ingredients are sketched in code after the list below. The architecture processes 224x224 images and requires only about 0.3 GMACs per inference.

  • Employs RandomErasing and mixup augmentation techniques
  • Features dropout for regularization
  • Implements standard random-resize-crop augmentation
  • Optimized for mobile deployment with 5.5M activations
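For readers curious how those recipe ingredients fit together, here is a hedged sketch using timm's own optimizer, EMA, and scheduler utilities; the hyperparameter values are illustrative placeholders, not the exact settings used to train this checkpoint.

```python
import timm
from timm.optim import create_optimizer_v2
from timm.scheduler import create_scheduler_v2
from timm.utils import ModelEmaV2

# Fresh model for training; hyperparameters below are illustrative only.
model = timm.create_model('mnasnet_100', pretrained=False)

# RMSProp with TensorFlow 1.0 behaviour is exposed in timm as 'rmsproptf'.
optimizer = create_optimizer_v2(
    model, opt='rmsproptf', lr=0.048, weight_decay=1e-5)

# EMA weight averaging keeps a slowly updated shadow copy of the weights.
model_ema = ModelEmaV2(model, decay=0.9999)

# Step-based LR decay with a linear warmup phase.
scheduler, num_epochs = create_scheduler_v2(
    optimizer, sched='step', decay_epochs=2.4, decay_rate=0.97,
    warmup_epochs=3)
```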

Core Capabilities

  • Image classification with ImageNet-1k classes
  • Feature map extraction with multiple resolution outputs (see the sketch after this list)
  • Image embedding generation
  • Mobile-optimized inference
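The capabilities above map directly onto timm's model-creation flags; the following sketch uses a dummy input tensor in place of a real image.

```python
import timm
import torch

x = torch.randn(1, 3, 224, 224)  # dummy input at the model's native resolution

# Multi-resolution feature maps: one tensor per network stage.
fx = timm.create_model('mnasnet_100.rmsp_in1k', pretrained=True,
                       features_only=True)
with torch.no_grad():
    for fmap in fx(x):
        print(fmap.shape)  # progressively downsampled feature maps

# Image embeddings: drop the classifier head with num_classes=0.
embedder = timm.create_model('mnasnet_100.rmsp_in1k', pretrained=True,
                             num_classes=0)
with torch.no_grad():
    emb = embedder(x)  # pooled feature vector
```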

Frequently Asked Questions

Q: What makes this model unique?

The model stands out for its platform-aware architecture design, specifically optimized for mobile deployment while maintaining competitive accuracy. Its RMSProp training recipe and efficient parameter utilization make it particularly suitable for resource-constrained environments.

Q: What are the recommended use cases?

This model is ideal for mobile and edge deployments where efficient image classification is required. It is particularly suitable for real-time applications that need solid accuracy at minimal computational cost. A common first step for such deployments is exporting the model to a portable format, sketched below.
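As an illustration of edge-deployment preparation (not part of the original card), one might export the network with PyTorch's standard ONNX exporter; the output filename and opset version below are arbitrary choices.

```python
import timm
import torch

# Export the pretrained model to ONNX for use with an edge runtime.
model = timm.create_model('mnasnet_100.rmsp_in1k', pretrained=True).eval()
dummy = torch.randn(1, 3, 224, 224)  # fixed input shape for the exported graph
torch.onnx.export(
    model, dummy, 'mnasnet_100.onnx',  # illustrative filename
    input_names=['input'], output_names=['logits'], opset_version=13)
```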
