Set up Ollama by following the instructions at https://ollama.ai/
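
On Linux, for example, the official install script can be piped straight to a shell (this assumes the script is still served at the path below; macOS downloads are available from the same site):

```sh
# Install Ollama on Linux via the official install script
# (assumption: the script is still hosted at this path; see https://ollama.ai/ if it moves)
curl -fsSL https://ollama.ai/install.sh | sh
```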

Run Ollama, allowing access from Amplenote: OLLAMA_ORIGINS=https://plugins.amplenote.com ollama serve
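
The environment variable only applies to that one invocation, so keep this command running in its own terminal. As a sanity check that the server is reachable, you can list the locally installed models (the curl call below assumes the stock port 11434):

```sh
# Start the server with CORS access allowed for the Amplenote plugin origin
OLLAMA_ORIGINS=https://plugins.amplenote.com ollama serve

# In a second terminal: confirm the server is up by listing local models
curl http://localhost:11434/api/tags
```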

Press Cmd+O on macOS, or Ctrl+O on Windows/Linux, to bring up the jump-to-note dialog

Type "ollama" and select the ollama plugin

Ask away:
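
Behind the scenes, the plugin presumably talks to the local Ollama HTTP API, which is why the OLLAMA_ORIGINS setting above is needed: the requests originate from https://plugins.amplenote.com in your browser. A rough sketch of such a request, with a hypothetical prompt and the default host and model, looks like this:

```sh
# A sketch of the kind of request the plugin issues (hypothetical prompt; default host and model)
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Summarize my meeting notes in three bullet points",
  "stream": false
}'
```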

The following settings are available:

host: defaults to localhost:11434. Change this if you are running Ollama on a different host or port.

model: the name of the model to use; defaults to llama2. For more models, visit https://ollama.ai/library
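
To switch models, pull the one you want locally first and then set model to its name in the plugin settings; mistral is used below purely as an example, and any name from the library works the same way:

```sh
# Download an alternative model from the Ollama library (mistral is just an example)
ollama pull mistral

# Confirm it shows up in the local model list
ollama list
```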