Go to PRSS Addons. On the Companion card, click the "Get in PRSS Creator Hub" button.

A new browser window will open. Click "Add to PRSS".

Open Companion by clicking "Open Companion", or via the highlighted menu option that appears.

You will see the Companion page, with two fields: Ollama API and AI Model.

Go to ollama.com and download the app for macOS. Once the download finishes, run the app.

Install the command-line tool when prompted by the Ollama installer.

Run "ollama run llama3.2" in your Terminal.

The llama3.2 model is ~2 GB, so the download may take a few minutes. Once it finishes, you can test prompts directly in the Terminal.
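
For example, a first run looks something like this (the prompt and reply are only illustrative; type "/bye" to exit the chat):

    $ ollama run llama3.2
    >>> Write a haiku about autumn.
    (the model's reply appears here)
    >>> /bye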

Close the Terminal and quit Ollama from your system's menu bar or taskbar.

Enter "http://localhost:11434/v1" in the Ollama API input box.

"llama3.2" will be autofilled in the AI Model box. If you installed a model other than llama3.2, please use that instead.

That's all! You can start Ollama from the Companion "Settings" tab at any time.
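
If you prefer the Terminal, you can also start the Ollama server manually with the command below (assuming the command-line tool from the earlier step is installed); the result is the same as starting it from Companion or from the Ollama app:

    ollama serve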

Once started, you can use Companion in your posts by right-clicking in the Editor, or by selecting text.