The latest advancements in artificial intelligence (AI) have machine learning enthusiasts eager to experiment, but the high cost of GPUs remains a major roadblock. Fortunately, an open-source software package called exo simplifies running large language models like DeepSeek R1 671B on old hardware.
With exo, you can distribute AI workloads across multiple devices, such as laptops, desktops, Raspberry Pis, and smartphones, by leveraging their memory and computing power. This approach eliminates the need for expensive GPUs, making it possible to run the latest AI algorithms locally on your own hardware.
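One of exo's selling points is that the distributed cluster presents itself like an ordinary local model server with a ChatGPT-compatible API, so you can query it with standard HTTP tooling. The sketch below is a minimal example under that assumption; the port number and the model identifier are assumptions on my part, so check the exo README for the values your install actually uses.

```python
import json
import urllib.request

# Assumed endpoint: exo advertises a ChatGPT-compatible API on the local node.
# The port (52415 here) and the model name are assumptions -- consult the
# exo README for the values used by your version.
ENDPOINT = "http://localhost:52415/v1/chat/completions"

payload = {
    "model": "deepseek-r1",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Summarize what exo does in one sentence."}
    ],
    "temperature": 0.7,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    reply = json.loads(response.read())

# The cluster answers like a single OpenAI-style server, even though the
# model is sharded across several devices behind the scenes.
print(reply["choices"][0]["message"]["content"])
```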
To use exo, all devices must be connected to the same local network; exo then automatically discovers the available devices, eliminating the need for complex configuration. A dynamic model partitioning strategy breaks the model into shards sized to each device's capabilities.
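To make the partitioning idea concrete, here is a simplified, hypothetical sketch of memory-weighted layer assignment: each device gets a contiguous slice of the model's layers proportional to its share of the pooled RAM. This illustrates the general technique only, not exo's actual implementation; the device names, memory sizes, and layer count are invented.

```python
# Illustrative sketch of memory-weighted model partitioning: split a model's
# layers across devices in proportion to each device's available RAM.
# Device names, memory sizes, and layer count are invented examples;
# this is not exo's actual code.

def partition_layers(num_layers: int, devices: dict[str, int]) -> dict[str, range]:
    """Assign a contiguous range of layers to each device, weighted by RAM (GB)."""
    total_mem = sum(devices.values())
    assignments: dict[str, range] = {}
    items = list(devices.items())
    start = 0
    for i, (name, mem_gb) in enumerate(items):
        if i == len(items) - 1:
            end = num_layers  # last device absorbs any rounding remainder
        else:
            end = start + round(num_layers * mem_gb / total_mem)
        assignments[name] = range(start, end)
        start = end
    return assignments


if __name__ == "__main__":
    # A hypothetical mixed cluster: one desktop, one laptop, one Raspberry Pi.
    cluster = {"desktop-64gb": 64, "laptop-16gb": 16, "raspberry-pi-8gb": 8}
    for device, layers in partition_layers(num_layers=60, devices=cluster).items():
        print(f"{device}: layers {layers.start}-{layers.stop - 1}")
```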
Running DeepSeek R1 671B requires a significant amount of memory, approximately 1,342 GB, far more than any single consumer device offers, though smaller models need much less. With exo, you can pool memory from multiple devices until the combined total is large enough for the model you want to run.
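As a back-of-the-envelope check, the snippet below estimates how many devices of a given RAM size you would need to pool roughly 1,342 GB. The device classes are arbitrary examples, and the estimate ignores the headroom a real cluster needs for the operating system and the inference runtime.

```python
import math

# Approximate memory needed to hold DeepSeek R1 671B, per the article.
MODEL_MEMORY_GB = 1342

# Hypothetical device classes and their RAM; real deployments also need
# headroom for the OS and runtime, which this rough estimate ignores.
device_ram_gb = {
    "128 GB workstation": 128,
    "64 GB desktop": 64,
    "16 GB laptop": 16,
}

for name, ram in device_ram_gb.items():
    count = math.ceil(MODEL_MEMORY_GB / ram)
    print(f"{name}: about {count} devices to pool {MODEL_MEMORY_GB} GB")
```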
Full instructions for installing and running exo are available on GitHub, allowing you to breathe new life into old hardware and experiment with powerful AI tools without breaking the bank.
Source: https://www.hackster.io/news/dust-off-that-old-hardware-and-run-deepseek-r1-on-it-b9f58347de58