Nvidia has introduced the Jetson Orin Nano, a palm-sized computer designed for local artificial intelligence processing. Priced at $249, it promises roughly double the performance and efficiency of its predecessor at half the cost.
The Orin Nano can perform up to 70% more operations per second than its predecessor while drawing just 25 watts of power. It's aimed at hobbyists and developers building robotics applications, letting them run sophisticated AI models without relying on cloud connectivity.
Nvidia CEO Jensen Huang demonstrated the device in a brief YouTube video, highlighting its ability to perform almost 70 trillion operations per second. The company envisions the technology powering robots as well as large language models such as Meta's Llama.
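For readers curious what "running a large language model locally" looks like in practice, here is a minimal Python sketch using the Hugging Face transformers library on a CUDA-capable device. The specific model name, precision, and generation settings are illustrative assumptions, not details from the article or from Nvidia.

```python
# Minimal sketch: run a small Llama-family model locally on a CUDA-capable device.
# Assumptions: PyTorch and transformers are installed, and you have access to the
# chosen model (Meta's Llama checkpoints require accepting a license on Hugging Face).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"  # hypothetical small model; swap in any causal LM
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit limited on-board memory
).to(device)

# Generate a short response entirely on the local device, no cloud round trip.
prompt = "Give a one-sentence status report for a warehouse robot."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The point of the sketch is simply that inference happens on the board itself; any model small enough to fit in the device's memory could stand in for the example above.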
The Orin Nano is designed as a portable brain that can be plugged into other hardware to add AI functionality. It offers a cost-effective option for applications that require guaranteed uptime and minimal latency, such as warehouse robots.
To be clear, the Orin Nano isn't meant to replace Nvidia's high-end GPUs, which sell for tens of thousands of dollars. Still, with local AI processing becoming this affordable, startups can focus on developing genuine intellectual property rather than simply shoving AI models into smart devices.
Source: https://gizmodo.com/nvidias-new-250-jetson-computer-lets-hobbyists-play-around-with-ai-locally-2000539783