
Functions specified in Chat Options are not returned in the getToolCalls() method #1362

Open
iAMSagar44 opened this issue Sep 15, 2024 · 4 comments

@iAMSagar44 (Contributor)
Bug description
I have specified two functions in Chat Options. I can see from the logs that these functions are being invoked by the LLM, but when I try to retrieve the function details using the getToolCalls() method, they are not returned (even when the functions have actually been invoked by the LLM).

Code snippet:

    Prompt prompt = new Prompt(List.of(systemMessage, userMessage),
            OpenAiChatOptions.builder()
                    .withTemperature(0.7f)
                    .withModel("gpt-4o")
                    .withFunction("findPapers")
                    .withFunction("summarizePaper")
                    .withParallelToolCalls(false)
                    .build());

    Flux<ChatResponse> chatResponseStream = chatModel.stream(prompt);

    chatResponseStream.map(response -> response.getResult().getOutput().getToolCalls())
            .doOnNext(toolCalls -> {
                // Returns an empty list even when the function has actually been invoked.
                logger.info("Tool calls: {}", toolCalls);
            })
            .onErrorContinue((e, o) ->
                    logger.error("Error occurred while processing chat response", e))
            .subscribe();

Environment
Spring AI version - 1.0.0-M2

Steps to reproduce
Please see the sample code above for the steps.

Expected behavior
Expecting the getToolCalls() method to return the list of functions invoked by the LLM.

@JogoShugh commented Oct 8, 2024

How did you find logs to show you that? I am having the same problem. A few months ago I was auto-generating JSON schemas for some Kotlin data classes and populating them into my system message, but I figured I would try this Function support that auto-generates them too and is baked into Spring AI formally. I'm seeing the same issue as you and am trying to troubleshoot it now.

@asaikali added the openai, function calling, and bug labels on Oct 30, 2024
@majian159

+1

@markpollack added this to the 1.0.0-M4 milestone on Nov 5, 2024
@ilayaperumalg removed this from the 1.0.0-M4 milestone on Nov 8, 2024
@markpollack (Member)

The misconception is that response.getResult().getOutput().getToolCalls() would return the conversation that happened back and forth with the model for the tool requests; there can be multiple tool calls.

The only way now to gain access to that conversation is to take over control of the function calling yourself via the so-called 'proxy' feature in Spring AI. An example of getting close to the conversation is here.

That said, we would like to collect this information for you so you don't have to drop down that low into the code. To achieve that we need to improve the ChatGenerationMetadata class to contain a hashmap. I've created an issue for this: #1722
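For illustration, a minimal sketch of the proxy approach (hedged: builder method names as of roughly the M3/M4 milestones; chatModel and the findPapers function come from the original snippet above):

    import java.util.List;

    import org.springframework.ai.chat.messages.AssistantMessage;
    import org.springframework.ai.chat.model.ChatResponse;
    import org.springframework.ai.chat.prompt.Prompt;
    import org.springframework.ai.openai.OpenAiChatOptions;

    Prompt prompt = new Prompt("Find papers about retrieval-augmented generation",
            OpenAiChatOptions.builder()
                    .withModel("gpt-4o")
                    .withFunction("findPapers")
                    .withProxyToolCalls(true) // hand tool execution back to the caller
                    .build());

    ChatResponse response = chatModel.call(prompt);
    // With the proxy enabled, the model's tool-call requests are surfaced here
    // instead of being executed internally by Spring AI.
    List<AssistantMessage.ToolCall> toolCalls =
            response.getResult().getOutput().getToolCalls();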

@markpollack added the enhancement label and removed the bug label on Nov 11, 2024
@gaplo917

Similar to what @markpollack said, I experienced the same in the Ollama sync case; you need to add this line to get control back:

    val prompt = Prompt(
        "what's new in kotlin 2.1",
        OllamaOptions.builder()
            .withFunctionCallbacks(listOf(getExternalKnowledge))
            .withToolContext(mapOf("userId" to "user123"))
    +       .withProxyToolCalls(true)
            .build()
    )

When withProxyToolCalls(false):

[screenshot]

When withProxyToolCalls(true), you get control back, but you need to trigger the function on your own:

[screenshot]
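In Java the same manual round trip looks roughly like this (a sketch, not the definitive API: ToolResponseMessage and the AssistantMessage.ToolCall record exist in these milestones, while executeTool is a hypothetical helper that runs the matching function and returns its JSON result):

    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.ai.chat.messages.AssistantMessage;
    import org.springframework.ai.chat.messages.Message;
    import org.springframework.ai.chat.messages.ToolResponseMessage;
    import org.springframework.ai.chat.model.ChatResponse;
    import org.springframework.ai.chat.prompt.Prompt;

    // First call: with the proxy enabled, the model answers with tool-call
    // requests instead of Spring AI invoking the functions for you.
    ChatResponse response = chatModel.call(prompt);
    AssistantMessage assistantMessage = response.getResult().getOutput();

    List<ToolResponseMessage.ToolResponse> toolResponses = new ArrayList<>();
    for (AssistantMessage.ToolCall toolCall : assistantMessage.getToolCalls()) {
        // Hypothetical helper: dispatch on toolCall.name() and return a JSON string.
        String result = executeTool(toolCall.name(), toolCall.arguments());
        toolResponses.add(new ToolResponseMessage.ToolResponse(
                toolCall.id(), toolCall.name(), result));
    }

    // Second call: feed the assistant message and the tool results back so the
    // model can produce the final answer.
    List<Message> history = new ArrayList<>(prompt.getInstructions());
    history.add(assistantMessage);
    history.add(new ToolResponseMessage(toolResponses));
    ChatResponse finalResponse = chatModel.call(new Prompt(history, prompt.getOptions()));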

@markpollack There is also a bug in calculating the token usage when withProxyToolCalls(false): it doesn't count the tool-call usage. I'm not sure if other platforms have the same behaviour.
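For anyone comparing the numbers, a small sketch of where the usage comes from (assuming the Usage accessors available in these milestones):

    import org.springframework.ai.chat.metadata.Usage;
    import org.springframework.ai.chat.model.ChatResponse;

    ChatResponse response = chatModel.call(prompt);
    Usage usage = response.getMetadata().getUsage();
    // With withProxyToolCalls(false), this appears to cover only the final
    // completion, not the intermediate tool-call round trips.
    System.out.printf("prompt=%d generation=%d total=%d%n",
            usage.getPromptTokens(), usage.getGenerationTokens(), usage.getTotalTokens());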

Suggestion

Capture all the generations into the results list.
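Until something like that lands in the framework, an application-level sketch of the same idea (hedged; responses is assumed to be the list of ChatResponse objects collected from the manual proxy loop shown earlier):

    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.ai.chat.model.ChatResponse;
    import org.springframework.ai.chat.model.Generation;

    List<Generation> allGenerations = new ArrayList<>();
    long totalTokens = 0;

    // Capture every intermediate generation and its usage instead of
    // keeping only the final response.
    for (ChatResponse step : responses) {
        allGenerations.addAll(step.getResults());
        totalTokens += step.getMetadata().getUsage().getTotalTokens();
    }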
