Google Open-Sources Gemma 3: Rivals DeepSeek with a Fraction of the Compute


Golden Data, March 13 — Google (GOOG.O) CEO Sundar Pichai announced last night the company's latest open-source multimodal large model, Gemma 3, designed for low cost and high performance. Gemma 3 comes in four parameter sizes: 1 billion, 4 billion, 12 billion, and 27 billion. Even the largest 27-billion-parameter version can run inference efficiently on a single H100 GPU, requiring at least 10x less compute than comparable models to achieve the same results, making it currently the strongest small-parameter model. According to blind-test data from LMSYS Chatbot Arena, Gemma 3 ranks second only to DeepSeek's R1-671B, ahead of well-known models such as OpenAI's o3-mini and Llama3-405B.
