Sunday, 21st September 2025
llm-openrouter 0.5. New release of my LLM plugin for accessing models made available via OpenRouter. The release notes in full:
- Support for tool calling. Thanks, James Sanford. #43
- Support for reasoning options, for example llm -m openrouter/openai/gpt-5 'prove dogs exist' -o reasoning_effort medium. #45
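The reasoning options should work from LLM's Python API as well. Here's a rough sketch, assuming the plugin exposes the option to Python under the same reasoning_effort name it uses on the command line:

import llm

model = llm.get_model("openrouter/openai/gpt-5")
# reasoning_effort is the plugin option from the release notes (assumed name)
response = model.prompt("prove dogs exist", reasoning_effort="medium")
print(response.text())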
Tool calling is a really big deal, as it means you can now use the plugin to try out tools (and build agents, if you like) against any of the 179 tool-enabled models on that platform:
llm install llm-openrouter
llm keys set openrouter
# Paste key here
llm models --tools | grep 'OpenRouter:' | wc -l
# Outputs 179
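Tool calling works from LLM's Python API too. Here's a minimal sketch using a toy multiply() function as the tool; the model ID is just an example, any of those tool-enabled models should do:

import llm

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

model = llm.get_model("openrouter/openai/gpt-5")  # example: swap in any tool-enabled model
# chain() runs the full loop: the model requests the tool, LLM executes it,
# feeds the result back, and the model produces its final answer
response = model.chain("What is 1337 * 4242?", tools=[multiply])
print(response.text())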
Quite a few of the models hosted on OpenRouter can be accessed for free. Here's a tool-usage example using the llm-tools-datasette plugin against the new Grok 4 Fast model:
llm install llm-tools-datasette
llm -m openrouter/x-ai/grok-4-fast:free -T 'Datasette("https://datasette.io/content")' 'Count available plugins'
Outputs:
There are 154 available plugins.
The output of llm logs -cu shows the tool calls and SQL queries it executed to get that result.
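If you want to sanity-check that number, the Datasette instance the tool is pointed at exposes a JSON API that accepts arbitrary SQL. A quick sketch, assuming the plugins are recorded in a table called plugins:

import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Run the same kind of count query the model ended up executing (assumed table name)
params = urlencode({
    "sql": "select count(*) as n from plugins",
    "_shape": "array",
})
with urlopen(f"https://datasette.io/content.json?{params}") as response:
    rows = json.loads(response.read())

print(rows[0]["n"])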
Locally AI. Handy new iOS app by Adrien Grondin for running local LLMs on your phone. It just added support for the new iOS 26 Apple Foundation model, so you can install this app and instantly start a conversation with that model without any additional download.
The app can also run a variety of other models using MLX, including members of the Gemma, Llama 3.2, and Qwen families.