New York: Monday, April 14, 2025
SCIENCE & TECH: ‘Catastrophic overtraining’ could harm large language AI models that are trained on more data for the sake of training

  • Researchers from top US universities warn that extending pre-training can be detrimental to performance
  • Beyond a certain point, additional pre-training can deliver worse performance through something akin to the butterfly effect
  • The longer models are pre-trained, the more sensitive they become to small changes that can disrupt the end result

Researchers from Carnegie Mellon, Stanford, Harvard, and Princeton are challenging one of AI development’s accepted core beliefs: that the more pre-training data, the better the performance.

As reported by HPCwire, a new paper discusses the concept of “catastrophic overtraining,” whereby extended pre-training can harm a model’s performance after it is fine-tuned.
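To picture the intuition behind that claim, the toy sketch below (an illustration of the idea, not the researchers’ actual method) models the loss around a pre-trained optimum as a quadratic bowl whose sharpness stands in for the amount of pre-training; the same small weight nudge, such as a fine-tuning update, then does more damage as the bowl gets sharper.

```python
# Toy sketch (illustrative assumption, not the paper's experiment):
# a model sitting in a sharper loss minimum is more sensitive to the
# same small weight perturbation.
import numpy as np

rng = np.random.default_rng(seed=0)

def loss(weights, sharpness):
    """Quadratic bowl around the pre-trained optimum at w = 0.
    'sharpness' is a stand-in for how strongly extended pre-training
    has pinned the weights."""
    return sharpness * float(np.sum(weights ** 2))

n_params = 10_000
# One identical small nudge, e.g. a fine-tuning step or noise.
perturbation = 0.01 * rng.standard_normal(n_params)

for sharpness in (1.0, 10.0, 100.0):  # "more pre-training" -> sharper minimum
    w_star = np.zeros(n_params)       # weights at the pre-trained optimum
    degraded = loss(w_star + perturbation, sharpness)
    print(f"sharpness={sharpness:6.1f}  loss after identical nudge={degraded:.4f}")
```

In this toy setup the loss after the nudge scales linearly with the sharpness, so the run prints roughly 1, 10, and 100: the same small change disrupts the "longer-trained" model far more.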


