
Update and fix the model param of Deepseek (#5329)

Richards Tu 10 months ago
parent
commit
c163521b9e

+ 1 - 1
api/core/model_runtime/model_providers/deepseek/llm/deepseek-chat.yaml

@@ -23,7 +23,7 @@ parameter_rules:
     type: int
     default: 4096
     min: 1
-    max: 32000
+    max: 4096
     help:
       zh_Hans: 指定生成结果长度的上限。如果生成结果截断,可以调大该参数。
       en_US: Specifies the upper limit on the length of generated results. If the generated results are truncated, you can increase this parameter.

+ 2 - 2
api/core/model_runtime/model_providers/deepseek/llm/deepseek-coder.yaml

@@ -7,7 +7,7 @@ features:
   - agent-thought
 model_properties:
   mode: chat
-  context_size: 16000
+  context_size: 32000
 parameter_rules:
   - name: temperature
     use_template: temperature
@@ -22,5 +22,5 @@ parameter_rules:
   - name: max_tokens
     use_template: max_tokens
     min: 1
-    max: 32000
+    max: 4096
     default: 1024