A new study from MIT suggests that the largest and most computationally intensive AI models may soon offer diminishing returns compared with smaller models. By mapping scaling laws against continued improvements in model efficiency, the researchers found that it could become harder to wring leaps in performance out of giant models, while efficiency gains could make models running on more modest hardware increasingly capable over the next decade.
Neil Thompson, a computer scientist and professor at MIT involved in the study, said, “In the next five to ten years, there is a very good chance that things will begin to narrow.”
Leaps in efficiency, such as the remarkably low-cost model seen in January, already serve as a reality check for an AI industry accustomed to burning enormous amounts of compute.
As things stand, a frontier model like OpenAI’s is far better than a model trained with a fraction of the compute at an academic lab. The MIT team’s prediction may not hold if, for example, new training methods such as reinforcement learning produce surprising new results, but it suggests that big AI companies will have less of an edge in the future.
Hans Gundlach, a research scientist at MIT who led the analysis, became interested in the question because of the compute-hungry nature of cutting-edge models. Together with Thompson and Jason Lynch, another MIT research scientist, he mapped the future performance of frontier models against that of models built with more modest computational means. Gundlach says the predicted trend is especially pronounced for the reasoning models now in vogue, which rely more on extra computation during inference.
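The comparison the researchers draw can be illustrated with a toy model. This is purely a hypothetical sketch, not the MIT team’s actual methodology: it assumes benchmark performance follows a saturating power-law curve in effective compute, that frontier labs grow their compute budget 4x per year, and that algorithmic efficiency (which benefits everyone) doubles yearly. All constants are invented for illustration.

```python
def score(compute: float, alpha: float = 0.15) -> float:
    """Toy saturating scaling curve: the benchmark score approaches 1.0
    as effective compute grows, so each extra order of magnitude of
    compute buys a smaller improvement (diminishing returns)."""
    return 1.0 - compute ** -alpha

for year in range(0, 11):
    efficiency = 2.0 ** year                       # assumed: algorithms get 2x more efficient per year
    frontier = score(1e9 * 4.0 ** year * efficiency)  # assumed: frontier compute budget grows 4x/year
    modest = score(1e5 * efficiency)                  # fixed, modest hardware budget
    print(f"year {year:2d}  gap = {frontier - modest:.3f}")
```

Under these made-up assumptions the printed gap shrinks year over year: because the curve flattens at the top, the frontier model’s ever-larger compute buys less and less, while the efficiency gains that both share keep lifting the modest model, which is the qualitative narrowing the study describes.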
Thompson says the results show the value of honing an algorithm as well as scaling up compute. “If you are spending a lot of money training these models, then you should absolutely spend some of it trying to develop more efficient algorithms, because that can matter hugely,” he adds.
The study is especially interesting given today’s AI infrastructure boom (or should we say “bubble”?), which shows little sign of slowing down.
OpenAI and other US tech companies have signed deals worth hundreds of billions of dollars to build AI infrastructure in the United States. The world needs more compute, OpenAI’s president Greg Brockman declared this week as he announced a partnership between OpenAI and Broadcom for custom AI chips.
A growing number of experts are questioning the wisdom of these deals. Roughly 60 percent of the cost of a data center goes to GPUs, which depreciate quickly. The partnerships between the major players also appear circular and opaque.
