DeepMind claims its AI called RETRO matches the performance of neural networks 25 times its size, cutting the time and cost of training large language models (Will Douglas Heaven/MIT Technology Review)
Will Douglas Heaven / MIT Technology Review:
DeepMind claims its AI called RETRO matches the performance of neural networks 25 times its size, cutting the time and cost of training large language models — RETRO uses an external memory to look up passages of text on the fly, avoiding some of the costs of training a vast neural network
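The core idea reported here is that RETRO pairs a smaller language model with an external memory it queries at inference time, retrieving relevant text chunks instead of storing everything in its weights. The snippet below is a minimal, hypothetical sketch of that retrieval step (nearest-neighbour lookup over a small chunk database); it is not DeepMind's implementation, and the embed function and MEMORY contents are invented stand-ins for a real frozen encoder and a trillion-token corpus.

    import numpy as np

    # Toy "external memory": text chunks the model can look up on the fly
    # instead of memorizing them in its parameters.
    MEMORY = [
        "The Eiffel Tower is located in Paris, France.",
        "Water boils at 100 degrees Celsius at sea level.",
        "RETRO was described by DeepMind as a retrieval-enhanced transformer.",
    ]

    def embed(text: str, dim: int = 64) -> np.ndarray:
        # Hypothetical stand-in for a frozen sentence encoder:
        # a hashed bag-of-words vector, enough to illustrate the lookup.
        vec = np.zeros(dim)
        for token in text.lower().split():
            vec[hash(token) % dim] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm > 0 else vec

    # Pre-compute keys for every chunk in the external memory.
    MEMORY_KEYS = np.stack([embed(chunk) for chunk in MEMORY])

    def retrieve(query: str, k: int = 1) -> list[str]:
        # Return the k nearest-neighbour chunks for the query text.
        scores = MEMORY_KEYS @ embed(query)
        top = np.argsort(scores)[::-1][:k]
        return [MEMORY[i] for i in top]

    if __name__ == "__main__":
        prompt = "Where is the Eiffel Tower?"
        neighbour = retrieve(prompt, k=1)[0]
        # In a retrieval-augmented model, the retrieved chunk is fed to the
        # network alongside the prompt, so the weights need not store the fact.
        print("Prompt:   ", prompt)
        print("Retrieved:", neighbour)

In a real system the nearest-neighbour search runs over billions of pre-embedded chunks with an approximate index, which is what lets a comparatively small network match much larger ones on knowledge-heavy tasks.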