Using the AI writing assistant
Inline AI generation, tone rewrites, and SEO suggestions — powered by Google Gemini, streamed directly into the editor.
The VeloCMS AI writing assistant is not a bolted-on widget — it lives inside the TipTap editor and streams generations directly into your document. It's powered by Google Gemini 2.0 Flash behind a secure server-side proxy, so your API key is never exposed, and the text appears token-by-token just like you'd see in ChatGPT.
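To make the streaming concrete, here's a minimal sketch of what the editor does with each server-sent event it receives from the proxy. The chunk shape below mirrors Gemini's public streaming response format, but how VeloCMS's proxy actually frames its events is an assumption for illustration:

```typescript
// Assumed shape of one streamed chunk relayed by the proxy
// (mirrors Gemini's candidates/content/parts structure).
interface StreamChunk {
  candidates: { content: { parts: { text: string }[] } }[];
}

// Parse one raw SSE frame ("data: {...}") into the text it carries.
// Returns null for keep-alive comments and the final "[DONE]" sentinel.
function parseSseFrame(frame: string): string | null {
  if (!frame.startsWith("data:")) return null;   // comment or blank line
  const payload = frame.slice(5).trim();
  if (payload === "[DONE]") return null;         // end-of-stream marker
  const chunk: StreamChunk = JSON.parse(payload);
  return chunk.candidates[0]?.content.parts.map((p) => p.text).join("") ?? null;
}
```

Each non-null result is inserted at the cursor as it arrives, which is why the text appears token-by-token rather than all at once.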
Quick start — generating your first AI paragraph
Open any post, move to a blank line, and type /ai-continue. A floating text input appears above your cursor. Describe what you want — something like 'explain the benefits of headless CMS for non-technical blog owners in a friendly tone' — and press Enter. Gemini processes the request and streams the response directly into your editor at that cursor position. The text is immediately editable; you don't need to accept or reject it in a modal. Just keep writing from where the AI left off, or delete the bits you don't like and try again.
Invoking the assistant from selected text
Select any text in the editor and the floating toolbar appears above your selection. The AI button (sparkle icon) gives you context-aware commands: Rewrite turns your selection into a different version, Summarize collapses it into a TL;DR, and Improve SEO rewrites the selection with keywords woven in naturally. The AI sees your selected text as context, so it doesn't rewrite in a vacuum — it rewrites to stay consistent with what you've already written.
Built-in prompts
- Continue writing — extends the paragraph you are in based on context
- Rewrite — rephrases selected text in a different tone
- Summarize — collapses a section into a TL;DR
- Expand — turns bullet points into prose
- Suggest title — generates 5 title variants from the first paragraph
- Improve SEO — rewrites with target keywords woven in naturally
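Conceptually, each built-in command is a prompt template with your editor context spliced in. The template wording and names below are illustrative assumptions, not VeloCMS's actual server-side prompts:

```typescript
type BuiltInCommand =
  | "continue" | "rewrite" | "summarize"
  | "expand" | "suggest-title" | "improve-seo";

// Illustrative templates; {selection} is replaced with the editor context.
const PROMPTS: Record<BuiltInCommand, string> = {
  "continue": "Continue writing from this paragraph, matching its style:\n\n{selection}",
  "rewrite": "Rewrite this passage in a {tone} tone:\n\n{selection}",
  "summarize": "Summarize this passage as a short TL;DR:\n\n{selection}",
  "expand": "Turn these bullet points into flowing prose:\n\n{selection}",
  "suggest-title": "Suggest 5 title variants for a post that opens with:\n\n{selection}",
  "improve-seo": "Rewrite this passage, naturally weaving in the keywords {keywords}:\n\n{selection}",
};

function buildPrompt(
  command: BuiltInCommand,
  selection: string,
  vars: Record<string, string> = {},
): string {
  return PROMPTS[command].replace(/\{(\w+)\}/g, (_m: string, name: string) =>
    name === "selection" ? selection : (vars[name] ?? ""),
  );
}
```

This is why the commands feel context-aware: the selection (or surrounding paragraph) always travels with the instruction.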
Streaming, cancellation, and cost
Streaming uses real server-sent events (SSE) — if the generation is heading in the wrong direction, hit Escape and the request is cancelled on the server immediately. You're not waiting for a full response just to discard it. Only tokens actually generated (not the full potential response) count against your monthly limit. Gemini 2.0 Flash is priced cheaply enough that we include generous monthly limits in all paid plans — for most bloggers, you'd have to be running AI nonstop to hit the ceiling.
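Client-side, cancellation is the standard AbortController pattern: aborting the fetch drops the connection, and the proxy cancels the upstream Gemini call. The endpoint path and the characters-per-token heuristic below are illustrative assumptions (VeloCMS bills exact generated tokens, not an estimate):

```typescript
// Illustration only: approximate billed tokens from received chunks using
// the common rough heuristic of ~4 characters per token.
function approxBilledTokens(receivedChunks: string[]): number {
  const chars = receivedChunks.reduce((n, c) => n + c.length, 0);
  return Math.ceil(chars / 4);
}

// Start a generation and return a handle the editor can abort.
// "/api/ai/stream" is a hypothetical proxy endpoint for this sketch.
function startGeneration(prompt: string): AbortController {
  const controller = new AbortController();
  // In the editor, an Escape keydown handler calls controller.abort();
  // the proxy sees the closed connection and cancels the upstream call.
  fetch("/api/ai/stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
    signal: controller.signal,
  }).catch(() => { /* AbortError when the user cancels */ });
  return controller;
}
```

Because billing counts only chunks that actually arrived before the abort, cancelling early really does stop the meter.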
Bringing your own API key (BYOK)
If you're on Business or Agency and want to use your own Gemini API key — or switch to OpenAI, Anthropic, or a custom provider like Groq or local Ollama — you can paste your key in Settings → AI. Keys are encrypted at rest with AES-256 before being stored. The platform never sees your plaintext key. If you enter a BYOK key, all AI calls from your blog route through your key and count against your own API quota, not VeloCMS's shared pool.
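For the curious, here's what AES-256 encryption at rest can look like in practice. This is a sketch using Node's built-in crypto module with AES-256-GCM; the master-secret handling, salt, and storage format are assumptions for illustration, not VeloCMS's actual scheme:

```typescript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "node:crypto";

// Encrypt a BYOK API key before storing it. Output packs
// IV (12 bytes) + GCM auth tag (16 bytes) + ciphertext, base64-encoded.
function encryptApiKey(plaintext: string, masterSecret: string): string {
  const key = scryptSync(masterSecret, "velocms-byok", 32); // derive 256-bit key
  const iv = randomBytes(12);                               // GCM nonce
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

// Decrypt at call time; GCM's auth tag rejects tampered or wrong-key data.
function decryptApiKey(stored: string, masterSecret: string): string {
  const key = scryptSync(masterSecret, "velocms-byok", 32);
  const buf = Buffer.from(stored, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ct = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}
```

The important property is the one the docs promise: the plaintext key exists only in memory at call time, never in the database.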
Common pitfalls
The AI assistant works best when your post already has some content to anchor on. Typing /ai-continue in a completely blank document gives Gemini nothing to work with — the result tends to be generic. Write at least a title and a rough opening paragraph first, then use the assistant to extend. Another thing: the 'Improve SEO' command rewrites your text rather than merely annotating it. Save a draft before running it — or at least review the output carefully before publishing, since the SEO rewrite can sometimes change your voice significantly. Finally, don't use the AI assistant for factual research without verifying the output. Gemini is a language model, not a fact database.
Related articles
- Using slash commands in the editor — the full slash command reference including /ai-* commands
- Understanding SEO and LLM scores — how the AI's output affects your post's scoring
- BYOK AI keys — connecting your own Gemini, OpenAI, or Anthropic key
The AI assistant is a helper, not a replacement for your judgment. Always review generated text — factual claims need human verification.