Ollama: Easily run LLMs locally on macOS and Linux

Ollama

https://ollama.ai/

Get up and running with large language models, locally.

If you just want to run LLMs quickly from the command line on your computer, this is about as simple as it gets. Ollama provides an easy CLI for generating text, and there's also a Raycast extension for more powerful usage.
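As a rough sketch of the CLI workflow described above (model name is just an example; pick any model from the Ollama library):

```shell
# Download a model to run locally (llama2 used here as an example)
ollama pull llama2

# Start an interactive chat session in the terminal
ollama run llama2

# Or generate text non-interactively by passing a prompt directly
ollama run llama2 "Explain what a large language model is in one sentence."

# List the models you have downloaded
ollama list
```

These commands assume Ollama is installed and its background server is running; actual output depends on the model you choose.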