My go-to LLM tool just dropped a super simple Mac and PC app for local AI - why you should try it

Ollama has a new MacOS/Windows app that makes it even easier to chat with LLMs

ZDNET's key takeaways

  • Ollama's developers have released a native GUI for MacOS and Windows.
  • The new GUI greatly simplifies using AI locally.
  • The app is easy to install, and allows you to pull different LLMs.

If you use AI, there are several reasons why you would want to work with it locally instead of from the cloud.

First, it offers much more privacy. When using a Large Language Model (LLM) in the cloud, you never know if your queries or results are being tracked or even saved by a third party. Also, using an LLM locally saves energy. The amount of power required to use a cloud-based LLM is increasing and could be a problem in the future.

Ergo, locally hosted LLMs.

Also: How to run DeepSeek AI locally to protect your privacy - 2 easy ways

Ollama is a tool that allows you to run different LLMs. I've been using it for some time and have found it simplifies the process of downloading and using various models. Although it does require serious system resources (you wouldn't want to use it on an aging machine), it does run fast, and allows you to use different models.
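
For example, the traditional way to start chatting with a model is a single terminal command (llama3.2 here is just one of many models in the Ollama library; any model name from the library works):

ollama run llama3.2

That command pulls the model if it isn't already on your machine, then opens an interactive chat prompt right in the terminal.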

But Ollama by itself has been a command-line-only affair. There are some third-party GUIs (such as Msty, which has been my go-to). Until now, the developers behind Ollama hadn't produced their own GUI.

That all changed recently, and there's now a straightforward, user-friendly GUI, aptly named Ollama.

Works with common LLMs - but you can pull others

The GUI is fairly basic, but it's designed so that anyone can jump right in and start using it. There is also a short list of LLMs that can easily be pulled from the LLM drop-down list. Those models are fairly common (such as the Gemma, DeepSeek, and Qwen models). Select one of those models, and the Ollama GUI will pull it for you.

If you want to use a model that isn't listed, you'll have to pull it from the command line like so:

ollama pull MODEL

Where MODEL is the name of the model you want.
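
For example, to pull one of the Gemma models (the tag shown here is illustrative; check the library for current model names):

ollama pull gemma3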

Also: How I feed my files to a local AI for better, more relevant responses

You can find a full list of available models in the Ollama Library.

After you've pulled a model, it appears in the drop-down to the right of the query bar.
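
If you'd rather double-check from the terminal, you can also list every model you've pulled so far:

ollama list

Each entry shows the model's name, size, and when it was last modified.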

The Ollama app is as easy to use as any cloud-based AI interface on the market, and it's free to use on MacOS and Windows (sadly, there's no Linux version of the GUI).

I've kicked the tires of the Ollama app and found that, though it doesn't have quite the feature set of Msty, it's easier to use and fits in better with the MacOS aesthetic. The Ollama app also seems to be a bit faster than Msty (in both opening and responding to queries), which is a good thing because local AI can often be a bit slow (due to a lack of system resources).

How to install the Ollama app on Mac or Windows

You're in luck, as installing the Ollama app is as easy as installing any app on either MacOS or Windows. You simply point your browser to the Ollama download page, download the app for your OS, double-click the downloaded file, and follow the directions. For example, on MacOS, you drag the Ollama app icon into the Applications folder, and you're done.
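
Installing the desktop app should also make the ollama command-line tool available, so you can confirm everything is in place with a quick version check from a terminal:

ollama --version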

Using Ollama is just as easy: select the model you want, let it download, then query away.

The Ollama app's model-selection drop-down: pulling an LLM is as easy as selecting it from the list and letting the app do its thing. Jack Wallen/ZDNET

Should you try the Ollama app?

If you've been looking for a reason to try local AI, now is the perfect time.

Also: I tried Sanctum's local AI app, and it's exactly what I needed to keep my data private

The Ollama app makes migrating away from cloud-based AI as easy as it can get. The app is free to install and use, as are the LLMs in the Ollama library. Give this a chance, and see if it doesn't become your go-to AI tool.

Want more stories about AI? Check out AI Leaderboard, our weekly newsletter.
