Hi,
I want to run some large language models locally, something like PrivateGPT, on my Apple Silicon machine, both to improve my privacy and to get some additional help.
Does anyone have recommendations or guides I could follow?
Thank you very much.
On macOS I’ve been using Ollama. It’s very easy to set up, can run as a service, and exposes an API.
You can talk to it directly from the CLI (`ollama run`) or via applications and plugins (like https://continue.dev ) that consume the API. It can also run on Linux, but I haven’t personally tried that.
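If you want to script against it rather than use the CLI, here’s a minimal sketch of calling Ollama’s local HTTP API from Python using only the standard library. It assumes the Ollama service is running on its default port (11434) and that you’ve already pulled a model; the model name `llama3` is just an example, swap in whatever you have installed.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes `ollama serve` is running)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama API."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_request("llama3", "Why is the sky blue?")

# To actually send it (requires the Ollama service to be running):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The same endpoint is what GUI apps and editor plugins talk to, so anything you wire up this way will coexist with them.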