A dockable side-panel chat assistant that lets you ask an LLM questions about your flashcards while reviewing.
!! Requires either an OpenRouter account with credits, a local Ollama installation, or a free Google Gemini API key — I suggest asking your preferred AI chatbot how to set this up !!
Features
- Streams responses in real time in a side panel during review
- Works with any note type: card fields are extracted automatically
- Supports OpenRouter (hundreds of models), local Ollama, and Google Gemini (free-tier models available)
- Each provider remembers its own selected model independently
- Markdown rendering (code blocks, lists, headings, etc.)
- Customisable system prompt, temperature, max tokens, font size, and panel width
- Context-aware: sends the current card's content to the LLM automatically
- Collapsible panel to stay out of your way
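The "context-aware" feature above can be sketched roughly as follows. This is a minimal illustration, not the add-on's actual code: the function names (`strip_html`, `build_messages`) and the message layout are assumptions about how a card's fields might be flattened into a chat prompt.

```python
import re

# Hypothetical sketch (assumed names, not the add-on's real implementation)
# of turning a reviewed card's fields into chat messages for an LLM.

def strip_html(text: str) -> str:
    """Remove HTML tags so the LLM sees the card field as plain text."""
    return re.sub(r"<[^>]+>", "", text).strip()

def build_messages(fields: dict, question: str, system_prompt: str) -> list:
    """Combine the current card's fields and the user's question into
    an OpenAI-style messages list."""
    context = "\n".join(
        f"{name}: {strip_html(value)}" for name, value in fields.items()
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user",
         "content": f"Current card:\n{context}\n\nQuestion: {question}"},
    ]
```

Because the card content rides along in the user message, the model can answer questions about the card without any retrieval step; resetting the chat per card (see Usage) keeps the context from going stale.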
Setup
1. Install the add-on and restart Anki
2. Click the ⚙ gear icon in the panel (or Tools → Add-ons → Config)
3. Choose OpenRouter, Ollama, or Gemini as your provider
4. For OpenRouter: paste your API key from https://openrouter.ai
5. For Ollama: make sure Ollama is running locally
6. For Gemini: paste your API key from https://aistudio.google.com/apikey
7. Click “Refresh” to load models, then “Test Connection” to verify
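Anki add-on settings are stored as JSON, so the options above might map onto a config along these lines. This is a hypothetical sketch: the field names and defaults are assumptions, not the add-on's actual schema.

```json
{
  "provider": "openrouter",
  "openrouter_api_key": "",
  "gemini_api_key": "",
  "ollama_url": "http://localhost:11434",
  "system_prompt": "You are a helpful study assistant.",
  "temperature": 0.7,
  "max_tokens": 1024,
  "font_size": 14,
  "panel_width": 320
}
```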
Usage
Start reviewing cards and the panel appears on the right. Type a question and press Enter. The chat resets automatically when you move to the next card.
Source Code https://github.com/miniminkus/anki-llm-chat
Reviews (1)
👍 2026-02-26
Looks amazing! Can you please add Gemini Gemma model API support (direct, not via openrouter)? It’s free without any top up.