
fix: fix tongyi models blocking mode with incremental_output=stream (#13620)

Yingchun Lai · 2 months ago
Parent commit: a3d3e30e3a
1 changed file with 1 addition and 1 deletion

api/core/model_runtime/model_providers/tongyi/llm/llm.py  (+1 −1)

@@ -197,7 +197,7 @@ class TongyiLargeLanguageModel(LargeLanguageModel):
         else:
             # nothing different between chat model and completion model in tongyi
             params["messages"] = self._convert_prompt_messages_to_tongyi_messages(prompt_messages)
-            response = Generation.call(**params, result_format="message", stream=stream, incremental_output=True)
+            response = Generation.call(**params, result_format="message", stream=stream, incremental_output=stream)
         if stream:
             return self._handle_generate_stream_response(model, credentials, response, prompt_messages)
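The one-line change ties `incremental_output` to the `stream` flag instead of hard-coding it to `True`. In DashScope's SDK, incremental output is only meaningful when the response is streamed chunk by chunk, so forcing `incremental_output=True` on blocking calls (`stream=False`) is what broke Tongyi's blocking mode. Below is a minimal sketch of the two valid flag combinations, assuming the `dashscope` SDK is installed and `DASHSCOPE_API_KEY` is set in the environment; the model name `qwen-plus` is illustrative, not taken from this commit.

```python
from dashscope import Generation

messages = [{"role": "user", "content": "Hello"}]

# Blocking mode: one complete response object; incremental_output stays
# False because there are no chunks to emit incrementally.
response = Generation.call(
    model="qwen-plus",
    messages=messages,
    result_format="message",
    stream=False,
    incremental_output=False,
)
print(response.output.choices[0].message.content)

# Streaming mode: incremental_output=True makes each chunk carry only
# the newly generated delta rather than the full text accumulated so far.
for chunk in Generation.call(
    model="qwen-plus",
    messages=messages,
    result_format="message",
    stream=True,
    incremental_output=True,
):
    print(chunk.output.choices[0].message.content, end="", flush=True)
print()
```

Passing `incremental_output=stream` collapses both cases into a single call site, which is exactly what the patch does.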