Hugging Face has released two tiny AI models, SmolVLM-256M and SmolVLM-500M, designed to run efficiently on resource-constrained devices such as laptops with under 1GB of RAM. At 256 million and 500 million parameters respectively, the compact models handle tasks such as describing images and short videos and answering questions about PDFs and charts. Trained on datasets from The Cauldron and Docmatix, both created by Hugging Face's M4 team, they outperform the far larger Idefics 80B on benchmarks such as AI2D, which tests a model's ability to analyze grade-school-level science diagrams.
SmolVLM-256M and SmolVLM-500M are available on the web and for download from Hugging Face under an Apache 2.0 license. However, while small models are cost-effective and versatile, they can struggle with complex reasoning: a recent study by researchers at Google DeepMind, Microsoft Research, and Mila in Quebec suggests that smaller models may rely too heavily on surface-level patterns rather than broader contextual understanding.
Source: https://techcrunch.com/2025/01/23/hugging-face-claims-its-new-ai-models-are-the-smallest-of-their-kind