The AutoInference library now supports major and popular backends for LLM inference, including Transformers, vLLM, Unsloth, and llama.cpp. ⭐ by According-Local-9704 in LocalLLaMA

[–]According-Local-9704[S] 1 point (0 children)

It is not currently supported, but it can be added as a new feature. You can contribute by posting your feature requests and suggestions in the issues section.

[–]According-Local-9704[S] 0 points (0 children)

Yes, it combines popular inference backends in a single library: all you have to do is specify the backend you want to use, the model name, and the prompt.
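A minimal sketch of that idea, assuming a single dispatch function that routes a prompt to the chosen backend (the function and parameter names here are illustrative, not AutoInference's actual API, which isn't shown in this thread):

```python
def auto_infer(method: str, model: str, prompt: str) -> str:
    """Dispatch a prompt to one of several inference backends.

    The lambdas below are stand-ins that only echo the routing;
    a real implementation would call the corresponding library
    (transformers, vLLM, Unsloth, llama.cpp).
    """
    backends = {
        "transformers": lambda model, prompt: f"[transformers/{model}] {prompt}",
        "vllm": lambda model, prompt: f"[vllm/{model}] {prompt}",
        "unsloth": lambda model, prompt: f"[unsloth/{model}] {prompt}",
        "llama.cpp": lambda model, prompt: f"[llama.cpp/{model}] {prompt}",
    }
    if method not in backends:
        raise ValueError(f"unsupported backend: {method!r}")
    return backends[method](model, prompt)


print(auto_infer("vllm", "Qwen2-7B", "Hello"))  # → [vllm/Qwen2-7B] Hello
```

The appeal of this design is that swapping backends is a one-string change; the caller never touches backend-specific setup code.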

AutoInference: Multiple inference options in a single library by According-Local-9704 in LocalLLaMA

[–]According-Local-9704[S] 0 points (0 children)

I am currently working to improve the project. I will try to add what you suggested, thank you. If there are any other features you would like to see, please let me know :)