Medium severity: LiteLLM Router, Proxy (model/provider switching)

Conversation history/context appears lost or invalid after switching the LiteLLM model/provider mid-conversation (e.g. from Gemini 2.5 Flash to Gemini 3 Pro): the model ignores prior messages, asks the user to repeat information, or errors on missing fields such as thought_signature. [LiteLLM Gemini 3 Blog](https://docs.litellm.ai/blog/gemini_3)

Root cause

When switching providers/models through the LiteLLM proxy/router, the conversation history loses critical provider-specific fields (e.g. Gemini's thought_signature) if the client appends only the assistant's text content to the history instead of the full response.message object. The new provider then rejects or misinterprets the context, or applies incompatible defaults/formats. [LiteLLM Gemini 3 Blog](https://docs.litellm.ai/blog/gemini_3)
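A minimal sketch of the lossy vs. lossless history patterns, using a mock response dict shaped like a LiteLLM chat-completion message (no network calls; the message content and signature value are illustrative). With the real client, the same principle means appending `response.choices[0].message` itself back onto `messages`, rather than rebuilding an assistant message by hand:

```python
# Mock of an assistant message as returned by a Gemini model via LiteLLM,
# carrying an opaque provider-specific field (value is a placeholder).
mock_message = {
    "role": "assistant",
    "content": "Step 1: extract the helper.",
    "thought_signature": "sig-placeholder",
}

messages = [{"role": "user", "content": "Plan a refactor."}]

# Lossy append: rebuilding the message from its text drops thought_signature,
# so a later model/provider switch sees an incomplete history.
lossy = messages + [{"role": "assistant", "content": mock_message["content"]}]

# Lossless append: pass the full message object through unchanged, so
# provider-specific fields survive the switch.
lossless = messages + [dict(mock_message)]

assert "thought_signature" not in lossy[-1]
assert "thought_signature" in lossless[-1]
```

The same "append the full message" rule also keeps tool-call fields and other provider extensions intact across router fallbacks.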

Tags: litellm, router, fallback, context_window, gemini, thought_signature, history
