OpenClaw 2026.3.12 Release

OpenClaw 2026.3.12 Brings Dashboard Overhaul, Fast Mode, and Plugin Architecture

OpenClaw shipped version 2026.3.12 on March 13, 2026. If you run AI models locally or manage a self-hosted setup, this release has some practical upgrades worth your time.

What's New

Dashboard v2 replaces the old gateway dashboard with something usable. It breaks things into modular views — overview, chat, config, agent, and session tabs — plus a command palette, mobile-friendly bottom tabs, and richer chat tools like slash commands, search, export, and pinned messages.

Fast mode is the other big addition. OpenAI and Anthropic sessions now support a fast toggle via /fast, the TUI, the Control UI, and ACP. It maps to the providers' own service_tier request parameter for lower-latency responses, and both providers' fast tiers have been verified against live endpoints.
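To make the mapping concrete, here is a minimal sketch of how a gateway might translate a session-level fast toggle into each provider's service_tier field. OpenAI's Chat Completions API does accept a service_tier parameter; the specific tier value names below and the Anthropic mapping are assumptions for illustration, not OpenClaw's actual code.

```python
# Sketch: map a session-level "fast" toggle onto a provider request.
# OpenAI's `service_tier` is a real Chat Completions parameter; the
# tier value names and the Anthropic mapping are assumed here.

def build_request(provider: str, messages: list, fast: bool) -> dict:
    """Return a request payload with the fast tier applied."""
    payload = {"messages": messages}
    if provider == "openai":
        payload["model"] = "gpt-4o"  # placeholder model name
        # Faster processing via the service_tier parameter.
        payload["service_tier"] = "priority" if fast else "default"
    elif provider == "anthropic":
        payload["model"] = "claude-sonnet-4"  # placeholder model name
        # Assumed equivalent knob on the Anthropic side.
        payload["service_tier"] = "priority" if fast else "standard"
    else:
        raise ValueError(f"unknown provider: {provider}")
    return payload

req = build_request("openai", [{"role": "user", "content": "hi"}], fast=True)
print(req["service_tier"])  # priority
```

Keeping the toggle at the session level means /fast, the TUI, and the Control UI can all flip one flag and every subsequent request picks it up.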

Plugin architecture for local models is the third piece. Ollama, vLLM, and SGLang now plug into the provider-plugin system instead of being hardcoded. That keeps the core leaner and makes swapping backends less painful.

Security Fixes

Two security items worth noting:

  • Device pairing now uses short-lived bootstrap tokens instead of embedding shared gateway credentials in QR codes or chat payloads. This closes a genuine risk for exposed gateways.
  • Workspace plugins no longer auto-load implicitly. Cloned repositories can't execute plugin code without an explicit trust decision. This came through as GHSA-99qw-6mr3-36qr.
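The bootstrap-token change in the first bullet follows a standard pattern: mint a random one-time token with a short TTL, embed that in the QR code, and let the device exchange it exactly once for real credentials. A minimal sketch, assuming in-memory state; all names and the 60-second TTL are illustrative, not OpenClaw's actual pairing code:

```python
# Hedged sketch of short-lived bootstrap-token pairing. Instead of
# putting long-lived gateway credentials in a QR code, the gateway
# mints a random one-time token that expires quickly.
import secrets
import time

TOKEN_TTL_SECONDS = 60   # assumed short lifetime
_pending: dict = {}      # token -> expiry timestamp

def issue_bootstrap_token() -> str:
    """Mint a one-time token to embed in the pairing QR code."""
    token = secrets.token_urlsafe(32)
    _pending[token] = time.time() + TOKEN_TTL_SECONDS
    return token

def redeem(token: str) -> bool:
    """Exchange the token once; unknown or expired tokens fail."""
    expiry = _pending.pop(token, None)  # one-time use: always remove
    return expiry is not None and time.time() < expiry

t = issue_bootstrap_token()
print(redeem(t))   # True: first redemption within TTL
print(redeem(t))   # False: token already consumed
```

Even if the QR code leaks, an attacker gets at most a 60-second, single-use window rather than a durable credential, which is why this matters for gateways exposed to the network.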

Other Fixes

Kimi Coding tool calls stopped working correctly: responses were degrading into XML/plain text instead of proper tool_use blocks. Fixed.

TUI duplicate assistant messages? Fixed.

Cron proactive delivery was replaying duplicates after restarts. Fixed.

Mattermost block streaming was delivering duplicate messages (one threaded, one top-level). Fixed.

A handful of smaller issues with Telegram model picker persistence, Moonshot CN API auth, and BlueBubbles/iMessage echo deduplication also landed fixes.

The Bigger Picture

This release targets teams running local LLMs and inference servers in production. The plugin path for Ollama, SGLang, and vLLM matters because it makes the core more maintainable and lets you swap backends without wrestling with the main codebase.

The security fixes are overdue — ephemeral tokens and explicit plugin trust decisions are basic requirements for exposed AI gateways.

If you're on an earlier version, 2026.3.12 is worth updating to, especially if you care about the dashboard improvements or need faster model responses.


Published: March 13, 2026
Source: OpenClaw GitHub releases