The Control UI cron model field suggestions were built using only the
model ID (e.g. "gpt-oss:120b-cloud") without the provider prefix
(e.g. "ollama/"). When a user selected such a suggestion, the server
would fall back to the session's default provider (typically "anthropic"),
producing an error like:
    Failed to set model: model not allowed: anthropic/gpt-oss:120b-cloud
Fix loadCronModelSuggestions to include the provider prefix in each
suggestion so the submitted value is fully qualified (e.g.
"ollama/gpt-oss:120b-cloud"), matching the same format used by the
chat session model picker.
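A minimal sketch of the intended shape, assuming hypothetical types and names
(`ModelInfo`, `buildCronModelSuggestions`) rather than the actual Control UI code:

```typescript
// Hypothetical model descriptor; the real Control UI types may differ.
interface ModelInfo {
  provider: string; // e.g. "ollama"
  id: string;       // e.g. "gpt-oss:120b-cloud"
}

// Emit fully qualified "provider/model" suggestions so the server never
// has to guess the provider from the session default.
function buildCronModelSuggestions(models: ModelInfo[]): string[] {
  return models.map((m) => `${m.provider}/${m.id}`);
}
```

With this, selecting the Ollama model submits `"ollama/gpt-oss:120b-cloud"`
instead of the bare `"gpt-oss:120b-cloud"` that triggered the fallback.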
Fixes openclaw/openclaw#51306