GPT-5 Hype Overshadowed by Subpar Performance Behind the Scenes

GPT-5, the latest AI model update from OpenAI, was touted as a significant improvement over its predecessor, GPT-4. However, recent reports reveal that the upgrades in GPT-5 are mostly incremental and not as impressive as expected.

According to insiders, the central problem facing GPT-5 is the dwindling supply of high-quality web data left to train the model on. OpenAI researchers also struggled to get the full GPT-5 model to produce consistent results, particularly on basic math tasks.

The first attempt at GPT-5, codenamed Orion, did not deliver enough improvement to justify the hype surrounding the new model. Insiders reported that GPT-4's early development had been more impressive, and some questioned whether the new model truly deserved the GPT-5 name.

Veteran AI expert Gary Marcus noted that "pure scaling is not getting us to [digital superintelligence]." That sentiment is echoed by Yunyu Lin, a researcher who found that large language models, including OpenAI's o3 and o4-mini, degrade over time.

While the hype surrounding GPT-5 may be dying down, OpenAI remains on track to beat its revenue projection for 2025 and secure significant funding. The company's revenue has skyrocketed in the past year, and it is expected to go public in 2026.

In conclusion, while GPT-5 shows promise, its performance behind the scenes is far from spectacular. As OpenAI CEO Sam Altman himself noted, "returns are diminishing."

Source: https://mashable.com/article/gpt5-coming