
Support Ollama as other LLM provider #10

Open
arey opened this issue Nov 2, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@arey
Member

arey commented Nov 2, 2024

LangChain4j enables integration with 15+ LLM providers, allowing us to incorporate those that support the specific LangChain4j features used in this application: Streaming, Tools, and JSON mode.
Potential candidates include Google Vertex AI Gemini, Ollama, and Mistral AI.

This task should also take into account LLM switching, possibly through a Maven profile.
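One way the Maven-profile idea could look is a profile per provider, each pulling in the matching LangChain4j Spring Boot starter. This is only a hypothetical sketch: the profile ids, and the assumption that the project uses the `dev.langchain4j` starters, are mine, not taken from the repository.

```xml
<!-- Hypothetical pom.xml fragment: one profile per LLM provider.
     Artifact names follow the dev.langchain4j starter convention;
     verify them against the LangChain4j version actually in use. -->
<profiles>
  <profile>
    <id>openai</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>ollama</id>
    <dependencies>
      <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-ollama-spring-boot-starter</artifactId>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

Switching would then be `mvn spring-boot:run -Pollama` at build time; runtime switching would need a Spring profile or property instead.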

@arey arey added the enhancement New feature or request label Nov 2, 2024
@arey arey changed the title Supports other LLM providers Support other LLM providers Nov 2, 2024
@arey arey added the help wanted Extra attention is needed label Nov 9, 2024
@arey arey self-assigned this Dec 29, 2024
@arey arey removed the help wanted Extra attention is needed label Dec 29, 2024
@arey
Member Author

arey commented Dec 29, 2024

Test results with Ollama and the llama3.3 model:

Issue 1: toolExecutor is null
When asked the question How many veterinary cardiologists are there?, the LLM requests a call to the function getAllVets, which does not exist.
This is a known bug: langchain4j/langchain4j#1052 and ollama/ollama#8294

Details

2024-12-29T16:28:16.219+01:00 ERROR 58618 --- [nio-8080-exec-2] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: java.lang.NullPointerException: Cannot invoke "dev.langchain4j.service.tool.ToolExecutor.execute(dev.langchain4j.agent.tool.ToolExecutionRequest, Object)" because "toolExecutor" is null] with root cause

java.lang.NullPointerException: Cannot invoke "dev.langchain4j.service.tool.ToolExecutor.execute(dev.langchain4j.agent.tool.ToolExecutionRequest, Object)" because "toolExecutor" is null
at dev.langchain4j.service.AiServiceStreamingResponseHandler.onCompleteResponse(AiServiceStreamingResponseHandler.java:93) ~[langchain4j-1.0.0-alpha1.jar:na]
at dev.langchain4j.model.chat.StreamingChatLanguageModel$1.onComplete(StreamingChatLanguageModel.java:69) ~[langchain4j-core-1.0.0-alpha1.jar:na]
at dev.langchain4j.model.ollama.OllamaClient$2.onResponse(OllamaClient.java:180) ~[langchain4j-ollama-1.0.0-alpha1.jar:na]
at retrofit2.OkHttpCall$1.onResponse(OkHttpCall.java:161) ~[retrofit-2.9.0.jar:na]
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519) ~[okhttp-4.12.0.jar:na]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[na:na]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[na:na]
at java.base/java.lang.Thread.run(Thread.java:1583) ~[na:na]
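The NPE above happens because the tool name requested by the model is looked up in the registered-tools map, the lookup returns null for a hallucinated name, and execute() is then invoked on that null. A minimal, self-contained sketch of the failure mode and a defensive guard (these are illustrative classes, not LangChain4j's actual ToolExecutor wiring):

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of tool dispatch: if the model asks for a tool that is not
// registered, return an error message instead of NPE-ing on a null executor.
public class ToolGuard {

    public interface ToolExecutor {
        String execute(String arguments);
    }

    public static String dispatch(Map<String, ToolExecutor> tools,
                                  String requestedTool, String args) {
        ToolExecutor executor = tools.get(requestedTool);
        if (executor == null) {
            // Surface the hallucinated name instead of crashing.
            return "Unknown tool requested: " + requestedTool;
        }
        return executor.execute(args);
    }

    public static void main(String[] args) {
        Map<String, ToolExecutor> tools = new HashMap<>();
        tools.put("getVets", a -> "6 vets found");
        System.out.println(dispatch(tools, "getVets", "{}"));    // prints "6 vets found"
        System.out.println(dispatch(tools, "getAllVets", "{}")); // prints "Unknown tool requested: getAllVets"
    }
}
```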

To prevent this hallucination, we tried adding the following instructions to the system prompt:

Don't provide a function name that doesn't exist.

When dealing with vets, use the user augmented prompt instead of hallucinating a function tool name that does not exist (e.g. getVets, getVetsWithSpecialties, filterVetsBySpecialty).

We also tried decreasing the temperature, first to 0.5 and then to 0.2.

Issue 2: technical IDs displayed

When asked the question Please list the owners that come to the clinic., the LLM may display the ID of each owner.

Solution: add the following instruction to the system prompt.

Don't display technical information such as ID.
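Both mitigations from this thread amount to appending instructions to the system prompt. A trivial sketch of that wiring (the base prompt text and class name are placeholders, not the project's real prompt):

```java
// Appends the two mitigation instructions from this issue to a base system
// prompt. Hypothetical helper; the actual prompt lives in the application.
public class PromptMitigations {

    public static String withMitigations(String basePrompt) {
        return basePrompt
                + "\nDon't provide a function name that doesn't exist."
                + "\nDon't display technical information such as ID.";
    }

    public static void main(String[] args) {
        System.out.println(withMitigations("You are the Petclinic assistant."));
    }
}
```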

@arey arey changed the title Support other LLM providers Support Ollama as other LLM provider Dec 29, 2024
@arey
Member Author

arey commented Feb 1, 2025

I tried the deepseek-r1:14b model with LangChain4j 1.0.0-alpha1 and got the following exception:

2025-02-01T19:12:40.061+01:00 ERROR 93418 --- [lhost:11434/...] o.s.s.p.chat.AssistantController         : Unexpected chat error

java.lang.NullPointerException: Cannot invoke "okhttp3.ResponseBody.byteStream()" because the return value of "retrofit2.Response.body()" is null
	at dev.langchain4j.model.ollama.OllamaClient$2.onResponse(OllamaClient.java:164) ~[langchain4j-ollama-1.0.0-alpha1.jar:na]
	at retrofit2.OkHttpCall$1.onResponse(OkHttpCall.java:161) ~[retrofit-2.9.0.jar:na]
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519) ~[okhttp-4.12.0.jar:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[na:na]
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[na:na]
	at java.base/java.lang.Thread.run(Thread.java:1583) ~[na:na]
