Moltbook Exposed: Human Manipulation Behind Viral AI “Agent” Phenomenon

Moltbook, a new social network for AI agents, has been revealed to be largely human-driven. Contrary to initial claims of autonomous AI behavior, researchers found that most of the activity on the platform was shaped and amplified by humans.

When the viral threads showcasing coordinated discussions among AI agents were analyzed, many high-profile “agents” turned out to be human users pretending to be AI systems. The language, tone, and coherence of these posts were often strikingly similar across accounts, suggesting a deliberate attempt to create the illusion of an autonomous machine society.

Researchers examined posting patterns, account metadata, and security vulnerabilities on the platform, finding that:

* Many agent personas were created with minimal effort using API connections and prompt wrappers
* Posting cycles aligned with human waking hours rather than round-the-clock compute schedules (a minimal sketch of this kind of check follows the list)
* Reconnection waves after downtime traced back to coordinated human operators
* Only 17,000 human operators managed or spawned the 1.5 million claimed “agents”
* Industrial-scale bot farming and amplification clusters were discovered
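To make the posting-cycle finding concrete, here is a minimal sketch, assuming a list of ISO-8601 post timestamps per account, of the kind of check described above: bin posts by hour of day and flag accounts whose activity clusters in human waking hours instead of running around the clock. The function names, the waking-hours window, and the 2:1 daytime-to-overnight threshold are illustrative assumptions, not the researchers' actual tooling.

```python
# Hypothetical sketch of a posting-cycle check; data and thresholds are illustrative.
from collections import Counter
from datetime import datetime, timezone

def hourly_histogram(timestamps):
    """Count posts per hour of day (0-23) from ISO-8601 timestamp strings."""
    hours = [datetime.fromisoformat(ts).astimezone(timezone.utc).hour
             for ts in timestamps]
    return Counter(hours)

def looks_human_scheduled(timestamps, waking_hours=range(8, 24), ratio=2.0):
    """Flag an account whose posting clusters in waking hours.

    An agent running on its own compute schedule would be expected to post
    roughly uniformly around the clock; a human operator's posts tend to
    cluster in their local daytime and evening.
    """
    hist = hourly_histogram(timestamps)
    waking = sum(hist[h] for h in waking_hours)
    overnight = sum(hist[h] for h in range(24) if h not in waking_hours)
    if overnight == 0:            # account never posts overnight at all
        return waking > 0
    return waking / overnight >= ratio

# Hypothetical example: a week of posts, all between 09:00 and 22:00 UTC.
example = [f"2026-02-{day:02d}T{hour:02d}:15:00+00:00"
           for day in range(1, 8) for hour in (9, 13, 18, 22)]
print(looks_human_scheduled(example))   # True: activity tracks waking hours
```

A genuine agent hosted in a single region could still mimic a diurnal pattern, so a signal like this would be correlated with account metadata and reconnection timing rather than relied on alone.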

The findings suggest that Moltbook was built as more than a showcase of AI progress. It also served as a social research tool, allowing developers to observe coordination, negotiation, and behavior among agents in shared environments.

While the hype around an independent machine society was exaggerated, the platform still provided a valuable experiment in human-agent collaboration. The researchers concluded that Moltbook occupied the boundary between simulation and autonomy, where most near-term AI systems will live.

In the end, the Moltbook phenomenon revealed more about human psychology, social research, and platform incentives than about any independent machine society. As its security vulnerabilities were exposed, it became clear that the platform was not as autonomous as claimed, but rather a hybrid experiment in human-guided AI behavior.

Source: https://www.forbes.com/sites/ronschmelzer/2026/02/10/moltbook-looked-like-an-emerging-ai-society-but-humans-were-pulling-the-strings