LLM Node Fails with "has no attribute 'startswith'" when System Prompt is Set #33639
This is a known bug in Dify 1.13.1. The root cause is that the plugin calls `.startswith()` directly on the `TextPromptMessageContent` object instead of on the string it wraps. Workaround until a fix is released: leave the SYSTEM prompt field empty, since the node runs successfully without one. A fix is being worked on in PR #2762, which adds normalization helpers to properly handle these content objects.
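As an illustration of what such a normalization helper could look like (a minimal sketch only; the class and function names here are simplified stand-ins, not the actual PR #2762 code):

```python
from typing import Union


class TextPromptMessageContent:
    """Simplified stand-in for Dify's content class (illustrative only)."""

    def __init__(self, data: str):
        self.data = data  # the actual prompt text lives in an attribute


def normalize_prompt_content(content: Union[str, TextPromptMessageContent]) -> str:
    """Return plain text whether given a bare string or a content object."""
    if isinstance(content, TextPromptMessageContent):
        return content.data
    return content


# Both call sites can now safely use string methods:
print(normalize_prompt_content("hello").startswith("he"))
print(normalize_prompt_content(TextPromptMessageContent("hello")).startswith("he"))
```

With a helper like this applied before any `.startswith()` call, the pre-processing logic would no longer break when the system prompt arrives wrapped in a content object.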
Self Checks
1. Is this request related to a challenge you're experiencing? Tell me about your story.
I am trying to use the LLM node (specifically with Qwen3.5-plus-chat) in my workflow. When I leave the SYSTEM prompt field empty, the node runs successfully. However, as soon as I add any content to the SYSTEM prompt (e.g., "123"), the node fails immediately.
It was frustrating because the error message indicated a code-level issue rather than a user input error, making it difficult to debug on my end.
2. Additional context or comments
Error Message: `PluginInvokeError: {"args": {"description": "[Models] Error: 'TextPromptMessageContent' object has no attribute 'startswith'", "error_type": "InvokeError", ...}}`
Reproduction: This happens consistently whenever a System Prompt is set. It works fine with no System Prompt.
Hypothesis: The error suggests that the code is calling the `.startswith()` method directly on the `TextPromptMessageContent` object instead of on the string content within that object. This appears to be a bug in the plugin's pre-processing logic when handling system messages.
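The hypothesis above can be demonstrated with a minimal stand-in class (a simplified mock of Dify's class, not the real implementation, whose attribute name `data` is an assumption):

```python
# Minimal stand-in demonstrating the failure mode.
class TextPromptMessageContent:
    def __init__(self, data: str):
        self.data = data  # the actual prompt text lives in an attribute


content = TextPromptMessageContent("123")

# Buggy pattern: treating the content object as a plain string.
try:
    content.startswith("123")
except AttributeError as e:
    print(e)  # 'TextPromptMessageContent' object has no attribute 'startswith'

# Correct pattern: call string methods on the wrapped text.
print(content.data.startswith("123"))  # True
```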