Tom's Guide
Technology
Nigel Powell

Ollama just made it easier to use AI on your laptop — with no internet required


The free open-source model market just got a whole lot more interesting with the announcement that Ollama can now run HuggingFace GGUF models in an instant.

GGUF (GPT-Generated Unified Format) is a highly optimized single-file model format, created by the open-source AI community around the llama.cpp project, designed to let large models run on modest consumer hardware.
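To illustrate the single-file idea, here is a minimal Python sketch that parses the fixed header at the start of every GGUF file, as laid out in the llama.cpp GGUF specification. The toy byte string and the tensor/metadata counts below are invented purely for demonstration:

```python
import struct

# A GGUF file begins with a small fixed header (per the llama.cpp GGUF spec):
#   4 bytes  magic   b"GGUF"
#   uint32   version (little-endian; v3 is current)
#   uint64   tensor_count
#   uint64   metadata_kv_count
# Everything else -- weights, tokenizer, metadata -- follows in the same file.

def read_gguf_header(data: bytes) -> dict:
    """Parse the fixed-size 24-byte GGUF header from the start of a file."""
    magic, version, n_tensors, n_kv = struct.unpack("<4sIQQ", data[:24])
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensors": n_tensors, "metadata_kv": n_kv}

# Build a toy header to demonstrate (a real model file would continue with
# metadata key/value pairs and the tensor data itself).
toy = struct.pack("<4sIQQ", b"GGUF", 3, 291, 24)
print(read_gguf_header(toy))
```

Because the whole model travels as one self-describing file like this, a client only needs to read the header and metadata to know how to load it — no separate config files or conversion steps.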

Typical AI models demand large computers with powerful processors and huge amounts of memory. Packaging a compressed, quantized model into a single GGUF file makes it easy to download over the internet and run on just about any decent home computer. More importantly, such models can also be installed and run by non-technical users.

Previously, a model first had to be made available through the Ollama library before you could download and run it on your laptop. The new integration makes the whole process easier, and HuggingFace says it will work to simplify things even further.

Why is the GGUF format so important?

It's probably fair to say that the GGUF format has done more to increase the popularity and availability of open-source AI than just about any other recent development, bar the release of Meta's Llama family of models.

There are currently over 500 GGUF model files on HuggingFace to choose from, covering everything from image generation to multilingual chat models.

To make use of the format, users generally have to download their chosen GGUF model and follow the installation instructions for their preferred client software.

However, this new option provided by HuggingFace (HF) goes one step further — as long as you already have Ollama installed on your machine.

How to use GGUF models with Ollama

All you have to do is find your desired model on HuggingFace, click the Use This Model button at the top right of the page, and select Ollama. This pops up a window with the model's URL, ready to copy.

On Windows, go to the search bar, type 'cmd' and hit Enter. When the terminal window appears, paste in the URL you just copied (Ctrl-V) and press Enter again. Ollama will then automatically download the model, ready for use. Quick, easy and painless. The process on a Mac is much the same; just open the Terminal app instead of cmd.
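In practice, the copied text is an `ollama run` command pointing at an `hf.co/` model path. A sketch of what this looks like in the terminal, using one example repository (substitute whatever model you copied from the Use This Model button; the quantization tag is optional):

```shell
# Example model repo on HuggingFace -- replace with the one you copied.
MODEL="hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF"

# Ollama understands hf.co/ paths directly, so this downloads and runs the
# model even though it was never published in the Ollama library:
ollama run "$MODEL"

# An optional tag after a colon selects a specific quantization level:
ollama run "${MODEL}:Q4_K_M"
```

Lower quantization levels (such as Q4) trade a little accuracy for a much smaller download and lower memory use, which is usually the right choice on a laptop.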

It should be noted that these GGUF files can also be run using a growing variety of client apps. Some of the most popular include Jan, LM Studio and Msty. The format is also supported by the OpenWebUI chat application.

Final thoughts


The world of free open-source AI models continues to advance at a breakneck pace, in part fueled by developments like this. More and more people are finding that they can take advantage of the power and flexibility of small, locally run AI models without needing a computer science degree. And the performance is getting better all the time.

For example, there’s a growing community of users who use local GGUF models for specialist applications, such as assisting with their home business or helping with specific language translation tasks.

It’s good to see that open source continues to thrive, even when up against venture capital-backed megacorps.

