Support Ollama as other LLM provider #10
Test result with Ollama.

Issue 1: toolExecutor is null

```
2024-12-29T16:28:16.219+01:00 ERROR 58618 --- [nio-8080-exec-2] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: java.lang.NullPointerException: Cannot invoke "dev.langchain4j.service.tool.ToolExecutor.execute(dev.langchain4j.agent.tool.ToolExecutionRequest, Object)" because "toolExecutor" is null] with root cause

java.lang.NullPointerException: Cannot invoke "dev.langchain4j.service.tool.ToolExecutor.execute(dev.langchain4j.agent.tool.ToolExecutionRequest, Object)" because "toolExecutor" is null
```

To remove this hallucination, we tried adding an instruction to the prompt.
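For context on where that null comes from, here is a minimal sketch, assuming the pre-1.0 LangChain4j API (`chatLanguageModel` on the `AiServices` builder) and a hypothetical `Assistant` interface, `BookingTools` class, and `llama3.1` model name. LangChain4j resolves each tool call by name against the registered `@Tool` methods, so a hallucinated tool name has no matching `ToolExecutor` and the lookup yields null:

```java
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.service.AiServices;

public class OllamaToolSketch {

    // Hypothetical assistant interface for illustration.
    interface Assistant {
        String chat(String userMessage);
    }

    // Hypothetical tool class; each @Tool method becomes a named tool.
    static class BookingTools {
        @Tool("Cancels the booking with the given booking number")
        void cancelBooking(String bookingNumber) {
            // ... domain logic ...
        }
    }

    public static void main(String[] args) {
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434") // local Ollama server
                .modelName("llama3.1")             // illustrative, tool-capable model
                .build();

        // AiServices maps each @Tool method to a ToolExecutor keyed by tool name.
        // If the model hallucinates a tool name that was never registered here,
        // the lookup returns null and execute() throws the NPE shown above.
        Assistant assistant = AiServices.builder(Assistant.class)
                .chatLanguageModel(model)
                .tools(new BookingTools())
                .build();

        System.out.println(assistant.chat("Cancel booking 123-456"));
    }
}
```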
We also tried decreasing the temperature.

Issue 2: technical IDs are displayed

When asking a question, the answer exposes internal technical IDs. Solution: add another instruction to the prompt.
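The exact prompt wording and temperature value are not preserved above, so the system message and the value below are placeholders; this is just a sketch of where both adjustments go in LangChain4j:

```java
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.service.SystemMessage;

public class PromptTuningSketch {

    // Placeholder instruction: the actual wording used in the comment above
    // was not captured, so this text is purely illustrative.
    interface Assistant {
        @SystemMessage("Only call tools that are explicitly provided. "
                + "Never show internal technical IDs to the user.")
        String chat(String userMessage);
    }

    static OllamaChatModel lowTemperatureModel() {
        return OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1")    // illustrative model name
                .temperature(0.0)         // more deterministic output; the value
                                          // actually tested was not recorded
                .build();
    }
}
```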
I've tried the langchain4j-ollama 1.0.0-alpha1 and got:

```
2025-02-01T19:12:40.061+01:00 ERROR 93418 --- [lhost:11434/...] o.s.s.p.chat.AssistantController : Unexpected chat error

java.lang.NullPointerException: Cannot invoke "okhttp3.ResponseBody.byteStream()" because the return value of "retrofit2.Response.body()" is null
	at dev.langchain4j.model.ollama.OllamaClient$2.onResponse(OllamaClient.java:164) ~[langchain4j-ollama-1.0.0-alpha1.jar:na]
	at retrofit2.OkHttpCall$1.onResponse(OkHttpCall.java:161) ~[retrofit-2.9.0.jar:na]
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519) ~[okhttp-4.12.0.jar:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[na:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[na:na]
	at java.base/java.lang.Thread.run(Thread.java:1583) ~[na:na]
```
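For reference, a minimal sketch of how a streaming call like the one behind this trace is typically set up with the pre-1.0 langchain4j-ollama API; the model name and prompt are illustrative. The NPE is thrown inside OllamaClient's Retrofit callback when the HTTP response carries no body (typically a non-2xx reply from the Ollama server, for which Retrofit leaves `body()` null), and an `onError` handler is where such asynchronous failures can be caught and logged:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;
import dev.langchain4j.model.output.Response;

public class StreamingSketch {
    public static void main(String[] args) {
        OllamaStreamingChatModel model = OllamaStreamingChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1") // illustrative model name
                .build();

        model.generate("Hello", new StreamingResponseHandler<AiMessage>() {
            @Override
            public void onNext(String token) {
                System.out.print(token); // stream partial tokens as they arrive
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                // Failures inside the async HTTP callback surface here,
                // not at the generate() call site.
                System.err.println("Unexpected chat error: " + error);
            }
        });
    }
}
```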
LangChain4j enables integration with 15+ LLM providers, allowing us to incorporate those that support the specific LangChain4j features used in this application: Streaming, Tools, and JSON mode.
Potential candidates include Google Vertex AI Gemini, Ollama, and Mistral AI.
This task should also take into account LLM switching, possibly through a Maven profile.
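As one possible shape for the switching mechanism (a Spring-based complement to a Maven profile; the `llm.provider` property name and the model choices are assumptions, not part of this repo):

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.mistralai.MistralAiChatModel;
import dev.langchain4j.model.ollama.OllamaChatModel;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LlmConfig {

    // Selected when llm.provider=ollama
    @Bean
    @ConditionalOnProperty(name = "llm.provider", havingValue = "ollama")
    ChatLanguageModel ollamaModel() {
        return OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1")
                .build();
    }

    // Selected when llm.provider=mistralai
    @Bean
    @ConditionalOnProperty(name = "llm.provider", havingValue = "mistralai")
    ChatLanguageModel mistralModel() {
        return MistralAiChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
                .modelName("mistral-small-latest")
                .build();
    }
}
```

A Maven profile could then pull in the matching langchain4j-* dependency and set `llm.provider` at build time.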