
fix error when use farui-plus model (#7316)

Co-authored-by: 雪风 <xuefeng@shifaedu.cn>
噢哎哟喂 committed 8 months ago
Commit 6fdbc7dbf3
1 changed file with 2 additions and 0 deletions

+ 2 - 0
api/core/model_runtime/model_providers/tongyi/llm/llm.py

@@ -159,6 +159,8 @@ You should also complete the text started with ``` but not tell ``` directly.
         """
         if model in ['qwen-turbo-chat', 'qwen-plus-chat']:
             model = model.replace('-chat', '')
+        if model == 'farui-plus':
+            model = 'qwen-farui-plus'
 
         if model in self.tokenizers:
             tokenizer = self.tokenizers[model]
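
The hunk normalizes the requested model name before the tokenizer cache lookup: `-chat` variants reuse their base model's tokenizer, and `farui-plus` is keyed as `qwen-farui-plus`. A minimal standalone sketch of that normalization (the helper name here is illustrative, not from the repository):

```python
def normalize_model_name(model: str) -> str:
    """Map a requested Tongyi model name to its tokenizer key.

    Hypothetical standalone version of the normalization the patch
    performs inside llm.py before looking up self.tokenizers.
    """
    # '-chat' variants share the tokenizer of their base model.
    if model in ['qwen-turbo-chat', 'qwen-plus-chat']:
        model = model.replace('-chat', '')
    # The fix: 'farui-plus' is keyed under 'qwen-farui-plus',
    # so look it up there instead of raising a KeyError.
    if model == 'farui-plus':
        model = 'qwen-farui-plus'
    return model


print(normalize_model_name('qwen-turbo-chat'))  # qwen-turbo
print(normalize_model_name('farui-plus'))       # qwen-farui-plus
```

Without the second branch, a request for `farui-plus` would miss the tokenizer cache keyed by the `qwen-`-prefixed name, which is the error the commit title refers to.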