Apple Now Lets You Stack Multiple Devices to Run Large AI Models

At WWDC 2024, Apple announced Apple Intelligence. Undoubtedly, the key focus was privacy and how AI on your phone can actually help you. But Apple left a hidden gem for developers to discover.

It turns out that you can use MLX, Apple’s own open-source machine learning library, to fuse your Apple devices together into one giant AI cluster. This lets you run large AI models locally, with no internet connection.

One user has already shared his experience fusing two MacBooks, two iPhone 15 Pros, and an iPad to run the Llama 3 8B Instruct model in 4-bit quantization:

Connecting a bunch of iPhones, iPads and MacBooks together over a local network to make one big GPU.
Uses Apple’s open source ML library, MLX https://t.co/Ran48fwMNB

— Alex Cheema – e/acc (@ac_crypto) June 13, 2024

But does it work beyond Apple devices?

The developer who tested MLX across multiple Apple devices noted that any computer with decent computational power will do; the cluster does not have to be Apple-only.

Devices can be stacked to run large models with only one condition: they all have to be on the same network.
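The article doesn’t describe how the model is actually split across devices, but the usual approach in setups like this is pipeline parallelism: each machine hosts a contiguous slice of the model’s layers, sized to its available memory. The sketch below is purely conceptual (the `Device` class and `partition_layers` function are hypothetical, not part of MLX or the developer’s tool) and just illustrates how a heterogeneous cluster might divide up a 32-layer model like Llama 3 8B.

```python
# Conceptual sketch only -- not MLX's actual API. Illustrates the idea of
# assigning contiguous transformer-layer ranges to networked devices in
# proportion to each device's available memory.
from dataclasses import dataclass


@dataclass
class Device:
    name: str
    memory_gb: float  # memory available for model weights


def partition_layers(devices, num_layers):
    """Assign contiguous layer ranges to devices, weighted by memory."""
    total = sum(d.memory_gb for d in devices)
    shares, start = [], 0
    for i, d in enumerate(devices):
        if i == len(devices) - 1:
            count = num_layers - start  # last device takes the remainder
        else:
            count = round(num_layers * d.memory_gb / total)
        shares.append((d.name, range(start, start + count)))
        start += count
    return shares


# Hypothetical cluster; Llama 3 8B has 32 transformer layers.
cluster = [
    Device("MacBook Pro", 36.0),
    Device("MacBook Air", 16.0),
    Device("iPad", 8.0),
]
for name, layers in partition_layers(cluster, 32):
    print(f"{name}: layers {layers.start}-{layers.stop - 1}")
```

During inference, each device would run its slice and forward the intermediate activations to the next device over the local network, which is why all machines need to be reachable on the same network.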

This is possible because MLX is open source and can easily be installed as a Python package.
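For a single Apple Silicon Mac, getting started looks roughly like this (the model name below is one of the community 4-bit conversions on Hugging Face; availability and exact CLI flags may vary by `mlx-lm` version):

```shell
# MLX ships as a regular Python package (Apple Silicon Macs only);
# the companion mlx-lm package adds LLM loading and generation helpers.
pip install mlx mlx-lm

# Generate text with a 4-bit Llama 3 8B Instruct conversion from the
# mlx-community organization on Hugging Face:
python -m mlx_lm.generate \
    --model mlx-community/Meta-Llama-3-8B-Instruct-4bit \
    --prompt "Explain what MLX is in one paragraph."
```

The multi-device clustering shown in the tweet requires additional networking code on top of this; plain MLX handles the on-device compute.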

This opens new possibilities for developers and tech users. For example, complex tasks such as large-scale text generation, image synthesis with Stable Diffusion, and speech recognition with models like OpenAI’s Whisper can be executed more efficiently.

Developers can experiment with advanced AI models without the need for expensive cloud services, making AI research and application development more accessible.
