Functions specified in Chat Options is not returned in the getToolCalls() method #1362
Comments
How did you find logs to show you that? I am having the same problem. A few months ago I was auto-generating JSON schemas for some Kotlin data classes and populating them into my system message, but I figured I would try this Function support that auto-generates them too and is baked into Spring AI formally. I'm seeing the same issue as you and am trying to troubleshoot it now.

+1
The misconception is that the call [...]. The only way now to gain access to that conversation is to take over control of the function calling yourself via the so-called 'proxy' feature in Spring AI. An example of getting close to the conversation is here. That said, we would like to collect this information for you so you don't have to get down so low into the code. To achieve that we need to improve the class [...].
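To illustrate why `getToolCalls()` on the final response comes back empty, here is a self-contained Java sketch of the default (non-proxy) behavior. None of these types are Spring AI's actual classes; `ChatClientLoop`, `Message`, and `callModel` are hypothetical stand-ins modeling the idea: the framework detects tool calls, executes them internally, and only the final, tool-call-free message is handed back to the caller.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class ChatClientLoop {
    // Hypothetical stand-in for an assistant message that may carry tool calls.
    record Message(String text, List<String> toolCalls) {}

    static Message callModel(List<Message> history) {
        // Simulated model: requests a tool first, answers once a result is present.
        boolean toolResultPresent = history.stream()
                .anyMatch(m -> m.text().startsWith("TOOL_RESULT"));
        if (!toolResultPresent) {
            return new Message("", List.of("getWeather"));
        }
        return new Message("It is sunny.", List.of()); // final answer, no tool calls
    }

    // Framework-style loop: tool calls are consumed internally, never surfaced.
    static Message call(String userPrompt, Map<String, Function<String, String>> tools) {
        List<Message> history = new ArrayList<>();
        history.add(new Message(userPrompt, List.of()));
        Message response = callModel(history);
        while (!response.toolCalls().isEmpty()) {
            for (String name : response.toolCalls()) {
                String result = tools.get(name).apply("{}");
                history.add(new Message("TOOL_RESULT:" + result, List.of()));
            }
            response = callModel(history);
        }
        return response; // intermediate tool-call messages never leave the loop
    }

    public static void main(String[] args) {
        Message finalMsg = call("What's the weather?",
                Map.<String, Function<String, String>>of("getWeather", a -> "sunny"));
        System.out.println(finalMsg.text());
        System.out.println(finalMsg.toolCalls().size()); // 0 -> why getToolCalls() is empty
    }
}
```

The tool was clearly invoked, yet the returned message reports zero tool calls, which mirrors what the issue reports.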
Similar to what @markpollack said, I experience the same when using Ollama. In the sync case, you need to add this line to get control:

```kotlin
val prompt = Prompt(
    "what's news in kotlin 2.1",
    OllamaOptions.builder()
        .withFunctionCallbacks(listOf(getExternalKnowledge))
        .withToolContext(mapOf("userId" to "user123"))
        .withProxyToolCalls(true) // add this line to take over tool-call handling yourself
        .build()
)
```

When [...]. When [...]. @markpollack There is also a bug in calculating the token usage when [...]. Suggestion: capture all the generations into [...].
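With proxy tool calls enabled, the caller receives the raw response, tool calls included, and has to drive the loop (and sum per-call token usage) themselves. Below is a minimal self-contained Java sketch of that caller-driven pattern; `ProxyLoop`, `Response`, and `callModel` are hypothetical stand-ins, not Spring AI types, and the token counts are simulated:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class ProxyLoop {
    // Hypothetical response carrying text, requested tool calls, and token usage.
    record Response(String text, List<String> toolCalls, int tokens) {}

    static Response callModel(List<String> history) {
        // Simulated model: asks for a tool until a tool result is present.
        boolean haveResult = history.stream().anyMatch(h -> h.startsWith("TOOL_RESULT"));
        return haveResult
                ? new Response("done", List.of(), 7)
                : new Response("", List.of("lookup"), 5);
    }

    // Caller-driven loop: every intermediate generation (and its usage) is visible.
    static int run(String prompt, Map<String, Function<String, String>> tools,
                   List<Response> generations) {
        List<String> history = new ArrayList<>(List.of(prompt));
        int totalTokens = 0;
        Response r = callModel(history);
        generations.add(r);
        totalTokens += r.tokens();
        while (!r.toolCalls().isEmpty()) {
            for (String name : r.toolCalls()) {
                history.add("TOOL_RESULT:" + tools.get(name).apply("{}"));
            }
            r = callModel(history);
            generations.add(r);        // capture every generation, not just the last
            totalTokens += r.tokens(); // aggregate usage across all round trips
        }
        return totalTokens;
    }

    public static void main(String[] args) {
        List<Response> gens = new ArrayList<>();
        int tokens = run("question",
                Map.<String, Function<String, String>>of("lookup", a -> "fact"), gens);
        System.out.println(gens.size()); // the tool-call turn plus the final answer
        System.out.println(tokens);      // usage summed over both calls
    }
}
```

Collecting every generation as it arrives is also the natural place to fix the token-usage undercount mentioned above: summing usage over all round trips, rather than reading it off only the last response.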
Bug description
I have two functions that I have specified in Chat Options. I can see from the logs that these functions are being invoked by the LLM, but when I try to retrieve the function details using the `getToolCalls()` method, they are not returned (even though the functions have been invoked by the LLM). Code snippet -
Environment
Spring AI version - 1.0.0-M2
Steps to reproduce
Please see sample code above for the steps.
Expected behavior
Expecting the `getToolCalls()` method to return the list of functions invoked by the LLM.