Agent mode does not use streaming output, so the conversation experience seems slow #2280
Self Checks

Provide a description of the help you need
In agent mode, `assistant_cot_runner.py` does not use streaming output, so the conversation experience seems slow. Is there any good way to improve the experience?

Comments
🤖 Hey @lzk90s, good to see you again! Hope you're doing well. Let's dive into this issue you've raised.

To improve the user experience in agent mode, you could consider implementing streaming output. This would allow the application to send data in a continuous flow, making it appear faster and more responsive to the user. In the context of your application, you could modify the `invoke_llm` call:

```python
llm_result: LLMResult = model_instance.invoke_llm(
    prompt_messages=prompt_messages,
    model_parameters=app_orchestration_config.model_config.parameters,
    tools=[],
    stop=app_orchestration_config.model_config.stop,
    stream=True,  # Enable streaming
    user=self.user_id,
    callbacks=[],
)
```

In this example, setting `stream=True` enables streaming output. Keep in mind that streaming can increase the complexity of your application and may require additional resources, such as more memory or CPU, so it's important to weigh the trade-offs before implementing this feature. If you need further assistance or have any other questions, feel free to ask.
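For reference, here is a minimal sketch of consuming the streamed result. With `stream=True`, `invoke_llm` is expected to return a generator of chunks rather than a single `LLMResult`; the chunk shape used below (`chunk.delta.message.content`) is an assumption based on typical dify model-runtime entities and may differ in your version.

```python
# Assumption: with stream=True, invoke_llm returns a generator of
# chunk objects instead of a completed LLMResult.
response = model_instance.invoke_llm(
    prompt_messages=prompt_messages,
    model_parameters=app_orchestration_config.model_config.parameters,
    tools=[],
    stop=app_orchestration_config.model_config.stop,
    stream=True,
    user=self.user_id,
    callbacks=[],
)

pieces = []
for chunk in response:
    # Assumed chunk shape: each chunk carries an incremental piece of
    # the assistant message in chunk.delta.message.content.
    piece = chunk.delta.message.content
    pieces.append(piece)
    # Forward `piece` to the client here so text renders as it arrives.

full_text = "".join(pieces)  # complete answer, if still needed afterwards
```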
When using a model without function calling ability,

thanks
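For the conversation to actually feel faster, the streamed pieces also have to reach the browser incrementally. Below is a generic, self-contained sketch of forwarding chunks over server-sent events with Flask (dify's API is Flask-based, but none of the routes or helpers here are dify's actual code; `generate_answer_chunks` is a hypothetical stand-in for the agent runner's streamed output):

```python
from flask import Flask, Response

app = Flask(__name__)

def generate_answer_chunks():
    # Hypothetical stand-in for the agent runner's streamed output.
    yield from ["Stream", "ing ", "make", "s it ", "feel ", "fast."]

@app.route("/chat")
def chat():
    def event_stream():
        for piece in generate_answer_chunks():
            # One SSE frame per piece; the client (EventSource or a
            # fetch reader) renders each frame as soon as it arrives.
            yield f"data: {piece}\n\n"
    return Response(event_stream(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(port=5001, threaded=True)
```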