
Issue: Tool is always called with a string parameter instead of Model despite using OPENAI_FUNCTIONS agent #8042

Closed
emmanuelvisage opened this issue Jul 20, 2023 · 2 comments
Labels
Ɑ: agent Related to agents module 🤖:question A specific question about the codebase, product, project, or how to use a feature

Comments

@emmanuelvisage

Issue you'd like to raise.

Hi,

I have defined an OPENAI_FUNCTIONS agent and created a tool from a function, using a BaseModel as the input schema:

from langchain.agents import AgentType, initialize_agent
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder
from langchain.tools import Tool
from pydantic import BaseModel

class FooInputModel(BaseModel):
    id: str
    name: str

agent_kwargs = {
    "extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")]
}
memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

tool = Tool.from_function(
    name="FooGenerator",
    description="Foo the bar",
    func=foo,
    args_schema=FooInputModel,
)

agent = initialize_agent(
    [tool],
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    agent_kwargs=agent_kwargs,
    memory=memory,
)

My function foo is properly called when necessary; however, the input is always a string, whereas I would like a FooInputModel. How can I achieve this? Also, how can I verify that the agent is actually using OpenAI's function calling? I have doubts it's working, because when I print the agent I don't see any FunctionMessage in the history.

Thanks

Suggestion:

No response

@dosubot dosubot bot added Ɑ: agent Related to agents module 🤖:question A specific question about the codebase, product, project, or how to use a feature labels Jul 20, 2023
@dosubot

dosubot bot commented Jul 21, 2023

Answer generated by a 🤖

Answer

I understand that you're having trouble with the input to your function being treated as a string, even though you've specified a BaseModel as the input. You're also unsure if the agent is using the functions from OpenAI as expected, as you're not seeing any FunctionMessage in the history.

The issue with the input being treated as a string likely arises from using Tool.from_function to create your tool. A Tool is a single-input tool: it does not convert the input into the BaseModel you've specified, but instead passes the model's output through as one string.

To address this, you can create a custom tool class, similar to how CurrentStockPriceTool and StockPerformanceTool are defined. In these classes, the _run method takes arguments that match the fields of the BaseModel specified in args_schema. The agent will then automatically convert the input to this model before calling _run.

Here's how you can define your FooGenerator tool:

from typing import Type

from langchain.tools import BaseTool
from pydantic import BaseModel

class FooInputModel(BaseModel):
    id: str
    name: str

class FooGenerator(BaseTool):
    name: str = "FooGenerator"
    description: str = "Foo the bar"
    args_schema: Type[BaseModel] = FooInputModel

    def _run(self, id: str, name: str):
        return foo(id, name)

    async def _arun(self, id: str, name: str):
        raise NotImplementedError("FooGenerator does not support async")

Then, you can initialize your agent with this tool:

tools = [FooGenerator()]

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    agent_kwargs=agent_kwargs,
    memory=memory,
)

As for your second question, the OPENAI_FUNCTIONS agent doesn't print FunctionMessage in the history. It only prints the input and output of the conversation. If you want to see the function calls, you can set verbose=True when initializing the agent. This will print detailed logs, including the function calls.

This is one potential solution to your issue. If you find this solution helpful and believe it could benefit others, I encourage you to make a pull request to update the documentation. This way, other users who encounter the same issue can find this solution more easily.
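For intuition, the difference between the two dispatch styles can be sketched in plain Python. This is a simplified illustration of the behavior described above, not LangChain's actual implementation: a single-input tool hands the model's raw output to your function as one string, while a structured tool parses the arguments JSON emitted by the function-calling API and maps it onto named parameters.

```python
import json

def foo(id, name):
    return f"foo({id!r}, {name!r})"

# Single-input dispatch: the raw model output is passed through as one string,
# so a function expecting multiple named arguments fails (or receives a str).
def run_single_input_tool(func, raw_input: str):
    return func(raw_input)

# Structured dispatch: the JSON arguments are parsed and passed as kwargs,
# matching the fields declared in args_schema.
def run_structured_tool(func, raw_args_json: str):
    kwargs = json.loads(raw_args_json)
    return func(**kwargs)

print(run_structured_tool(foo, '{"id": "42", "name": "bar"}'))

try:
    run_single_input_tool(foo, '{"id": "42", "name": "bar"}')
except TypeError as e:
    print("single-input dispatch fails:", e)
```

Structured dispatch calls foo(id="42", name="bar"), while single-input dispatch raises a TypeError because foo receives the whole payload as one positional string.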

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.


@emmanuelvisage
Author

Yep, spot on. Thanks
