
omit prompt and generate settings from final response

Michael Yang 1 year ago
parent
commit
44869c59d6
1 changed file with 0 additions and 2 deletions

+ 0 - 2
llm/ext_server/server.cpp

@@ -1186,8 +1186,6 @@ struct llama_server_context
             {"model",               params.model_alias},
             {"tokens_predicted",    slot.n_decoded},
             {"tokens_evaluated",    slot.n_prompt_tokens},
-            {"generation_settings", get_formated_generation(slot)},
-            {"prompt",              slot.prompt},
             {"truncated",           slot.truncated},
             {"stopped_eos",         slot.stopped_eos},
             {"stopped_word",        slot.stopped_word},