Prithvi-WxC-1.0-2300M

Maintained by: ibm-nasa-geospatial


  • Parameter Count: 2.3 Billion
  • Training Data: MERRA-2 (160 variables)
  • Paper: arXiv:2409.13598
  • Organization: NASA & IBM Collaboration

What is Prithvi-WxC-1.0-2300M?

Prithvi-WxC-1.0-2300M is an AI foundation model for weather and climate applications, developed through a collaboration between NASA and IBM. The model has 2.3 billion parameters and was trained on 160 variables from NASA's MERRA-2 reanalysis dataset.

Implementation Details

The model is trained with a 50% masking ratio and can handle variable time deltas between input timestamps. It processes data from two timestamps as input and generates predictions for a single, possibly future, timestamp as output. During pretraining, input deltas were chosen from [-3, -6, -9, -12] hours with forecast lead times of [0, 6, 12, 24] hours.

  • Dual-objective training: Forecasting and masked reconstruction
  • Capable of reconstructing atmospheric states from partial information
  • Supports variable time-delta inputs and forecast lead times
  • Optimized for generic use cases beyond just forecasting
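The pretraining setup above can be sketched in a few lines. This is an illustrative NumPy mock-up, not the model's actual data pipeline: the grid size is arbitrary, and the real model masks tokenized patches rather than individual grid cells. Only the variable count (160), the 50% masking ratio, and the delta/lead-time sets come from the model card.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: 160 MERRA-2 variables on a small toy grid
# (the real MERRA-2 grid is far larger).
n_vars, lat, lon = 160, 8, 16

# Two input timestamps, as the model consumes per forward pass.
x_t0 = rng.standard_normal((n_vars, lat, lon))
x_t1 = rng.standard_normal((n_vars, lat, lon))

# Input delta and forecast lead time drawn from the pretraining sets.
input_delta = rng.choice([-3, -6, -9, -12])  # hours between the two inputs
lead_time = rng.choice([0, 6, 12, 24])       # hours ahead of the latest input

# 50% random masking for the masked-reconstruction objective.
mask = rng.random((lat, lon)) < 0.5
x_masked = np.where(mask, 0.0, x_t1)  # masked cells zeroed out

print(f"delta={input_delta}h, lead={lead_time}h, "
      f"masked fraction={mask.mean():.2f}")
```

The dual objective follows from this setup: with `lead_time > 0` the target is a future atmospheric state (forecasting); with heavy masking the target is the reconstruction of the hidden portion of the current state.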

Core Capabilities

  • Climate downscaling (part of the IBM Granite model family)
  • Gravity wave parameterization
  • Zero-shot reconstruction
  • Atmospheric state prediction
  • Future timestamp generation

Frequently Asked Questions

Q: What makes this model unique?

The model's ability to handle both forecasting and masked reconstruction tasks, combined with its extensive parameter count and training on comprehensive MERRA-2 data, makes it particularly powerful for weather and climate applications. It's also notable for being an open science initiative between NASA and IBM.

Q: What are the recommended use cases?

This version (prithvi.wxc.2300m.v1) is recommended for generic use cases that don't specifically focus on forecasting. For pure forecasting applications, the rollout version (prithvi.wxc.rollout.2300m.v1) is recommended instead.
