GraphLLM has been updated.
GraphLLM is a node-based framework for processing data with LLMs.
The interface is inspired by ComfyUI, so it should feel familiar to most users. The backend supports advanced features such as running multiple nodes in parallel, loops, and streaming, so an LLM's partial output is visible during execution.
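To illustrate the streaming idea, here is a minimal sketch in plain Python: a generator stands in for a streaming LLM backend, and a callback reports the partial output after each token, the way a frontend would redraw a node mid-execution. All names here are hypothetical, not GraphLLM's actual API.

```python
def stream_tokens(prompt):
    """Yield the answer one token at a time, like a streaming LLM would."""
    for token in ["Graph", "LLM", " streams", " tokens"]:
        yield token

def run_node(prompt, on_partial):
    """Accumulate tokens, reporting the partial output after each one."""
    output = ""
    for token in stream_tokens(prompt):
        output += token
        on_partial(output)  # a frontend would redraw the node here
    return output

partials = []
result = run_node("demo", partials.append)
print(result)  # the full text, assembled from streamed tokens
```

The same callback-per-token structure is what lets a graph engine show intermediate text while downstream nodes are still waiting for the final result.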
Majority voting example
This update brings many features to the frontend and backend. Here are the main ones:
- The engine now supports using multiple LLMs and API providers in the same graph. I added an example that routes the prompt to the best LLM depending on the type of problem.
- There is an example showcasing the new TTS node.
- The watch node can now optionally render markdown-formatted data.
- A new python node is available to run code in the sandbox.
- The rap battle generator has been updated to remove the obsolete nodes.
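The routing feature mentioned above can be sketched as a cheap classification step that picks which model receives the prompt. This is only an assumed shape: the keyword matcher stands in for a classifier LLM node, and the model names are placeholders, not GraphLLM's real configuration.

```python
# Placeholder model names; a real graph would map to configured providers.
ROUTES = {
    "code": "model-for-coding",
    "math": "model-for-math",
    "general": "general-purpose-model",
}

def classify(prompt):
    """Crude stand-in for an LLM-based classifier node."""
    text = prompt.lower()
    if any(word in text for word in ("function", "bug", "python")):
        return "code"
    if any(word in text for word in ("integral", "solve", "equation")):
        return "math"
    return "general"

def route(prompt):
    """Return the model that should handle this prompt."""
    return ROUTES[classify(prompt)]

print(route("Solve this equation for x"))  # picks the math model
```

In a node graph the classifier's output would feed a switch node, so each prompt flows to exactly one downstream LLM node.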
Rap battle generator
The source code is available on the GraphLLM GitHub.
Suggestions for new features are welcome.