
Fix for system prompt setting

1) Ollama accepts the system prompt as a top-level request parameter, not as an option. (See https://github.com/ollama/ollama/blob/main/docs/api.md#request-8) It currently sits in the options dictionary and needs to be moved to the payload dictionary.
2) After copying the system value from ollama_options to ollama_payload, delete it from ollama_options, to prevent Ollama from warning about an invalid option.
ferret99gt 2 months ago
Parent
Current commit
57b01cf8fb
1 changed file, with 6 additions and 1 deletion

+ 6 - 1
backend/open_webui/utils/payload.py

@@ -182,7 +182,12 @@ def convert_payload_openai_to_ollama(openai_payload: dict) -> dict:
         # Re-Mapping OpenAI's `max_tokens` -> Ollama's `num_predict`
         if "max_tokens" in ollama_options:
             ollama_options["num_predict"] = ollama_options["max_tokens"] 
-            del ollama_options["max_tokens"] # To prevent Ollama warning of invalid option provided        
+            del ollama_options["max_tokens"] # To prevent Ollama warning of invalid option provided
+
+        # Ollama lacks a "system" prompt option. It has to be provided as a direct parameter, so we copy it down.
+        if "system" in ollama_options:
+            ollama_payload["system"] = ollama_options["system"] 
+            del ollama_options["system"] # To prevent Ollama warning of invalid option provided
 
     # Add options to payload if any have been set
     if ollama_options:
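
The change above can be sketched in isolation. The helper name below is a stand-in for the relevant section of `convert_payload_openai_to_ollama`, and the example dictionaries are hypothetical; only the move-then-delete behavior mirrors the diff.

```python
# Sketch of the fix: Ollama takes "system" as a top-level request field,
# not inside "options", so the converter copies it out of the options
# dict and deletes it there to avoid an invalid-option warning.

def move_system_to_payload(ollama_payload: dict, ollama_options: dict) -> None:
    """Relocate the system prompt from the options dict to the payload."""
    if "system" in ollama_options:
        ollama_payload["system"] = ollama_options["system"]
        # Delete it so Ollama does not warn about an invalid option.
        del ollama_options["system"]


payload = {"model": "llama3"}
options = {"system": "You are a helpful assistant.", "temperature": 0.7}
move_system_to_payload(payload, options)
print(payload["system"])    # → You are a helpful assistant.
print("system" in options)  # → False
```

Deleting the key (rather than only copying it) matters because the converter later merges any remaining `ollama_options` into the request, where an unrecognized `system` option would trigger the warning the commit is fixing.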