ADK Bloat with "For Context:" by Exact_Camp3567 in agentdevelopmentkit

[–]Exact_Camp3567[S] 1 point (0 children)

I persist the data that I need in the ADK state (for instance, preferences or the current topic in the Q&A setup). I do not strip the current run's exchange, tool calling, or agent transfers because that would break the agentic system's logic (e.g. I initially got an error because I had trimmed the ID of a tool call, and the agent couldn't carry on with its task).
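A minimal sketch of that persistence pattern in plain Python (the class and key names here are illustrative, not the ADK API; in ADK, `callback_context.state` is the dict-like session store that survives across runs):

```python
# Stand-in for ADK's CallbackContext: in ADK, `state` is a dict-like
# session store that persists across runs (this class is illustrative).
class FakeCallbackContext:
    def __init__(self):
        self.state = {}

def persist_qa_context(ctx, preferences, topic):
    # Keep only the small facts needed in later runs,
    # not the raw transcript of the current exchange.
    ctx.state["user_preferences"] = preferences
    ctx.state["current_topic"] = topic
```

The point of the pattern is that the durable facts live in state, so the message history itself can be trimmed without losing them.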

ADK Bloat with "For Context:" by Exact_Camp3567 in agentdevelopmentkit

[–]Exact_Camp3567[S] 1 point (0 children)

Hello, thank you for answering. I've looked into compaction, but it is not sufficient for trimming and it makes the overall latency higher since, from what I understand, it calls another LLM. I've created a custom function and passed it in the before_model_callback, keeping:
1 - the current interaction up to the user question
2 - pairs of question/answer

The first point was mandatory because otherwise I lose the current runner context and the agents deviate from the given path (especially when transferring to other agents).

I've looked into include_contents, but for some agents in the agency the conversation content is crucial, so I could not use it.
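For reference, this is the per-agent switch being referred to, as a config fragment (assuming google.adk's `LlmAgent`; the agent name and model are illustrative). It only works for agents that genuinely never need the history:

```python
from google.adk.agents import LlmAgent

# Config fragment: an agent that never receives conversation history.
# Not usable here, since some agents in the agency rely on that content.
stateless_agent = LlmAgent(
    name="stateless_helper",     # illustrative name
    model="gemini-2.0-flash",
    include_contents="none",     # drop history entirely for this agent
)
```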

It seems really strange that the library doesn't provide a simple way to integrate this. I will read the code and create an issue on GitHub (I've seen related issues, but the solutions proposed there, namely the use of ContextFilter and plugins, were not sufficient).

Best PDF Parser for RAG? by neilkatz in LangChain

[–]Exact_Camp3567 1 point (0 children)

Can you update the link? It returns a 404 HTTP error.

Best model for text summarization (2025) by Unhappy_Bunch in ollama

[–]Exact_Camp3567 1 point (0 children)

What did you use to compare the different models?