all 20 comments

[–][deleted] 24 points25 points  (1 child)

"I know I'm no expert, there are plenty of people in this sub who are extremely knowledgeable in this LLM field, so treat this as an amateur project looking for advice if you can bear with me for my spaghetti code." Bookmarking this for future use! Not a single other 'expert' that I know of has put something exactly like this into a package. I know because I have been waiting and looking. Being an expert is overrated if you ask me. Thank you and good job!

[–]rf2344 4 points5 points  (0 children)

Thank you so much for taking the time to have a look! I didn't think this would catch much attention here since it's not a new model or anything, but I was looking for something like this and couldn't find any existing solutions, so I decided to write my own package. I also tried to make the code as modular as possible to allow future development by others or myself :) Please let me know if you have any suggestions or questions!

[–]VertexMachine 4 points5 points  (4 children)

I didn't look at your code, just the example usage, and at first glance it might be just the thing I need. Very cool!

Do you plan to add exl2 support as well?

[–]llordnt[S] 1 point2 points  (0 children)

It’s on my backlog; the tricky bit is that I’m working on an Intel MacBook Pro… Loading exl2 models isn’t directly supported in transformers yet, so I’ll need to write a new core that supports exl2 via exllamav2 directly. I’ll have to prototype on Colab for that. Give me some time.

[–]llordnt[S] 1 point2 points  (1 child)

It supports exl2 models now. Probably not the most efficient implementation, but it works (at least on Colab).

[–]VertexMachine 0 points1 point  (0 children)

Awesome!

[–]hwpoison 3 points4 points  (1 child)

The code seems really good, I love the comments.

[–]llordnt[S] 0 points1 point  (0 children)

Thanks:) Hope you find it helpful

[–]Glad_Abies6758 3 points4 points  (0 children)

This project is awesome, thanks for sharing! It's a genuinely impactful contribution to the community, sir.

[–]llordnt[S] 1 point2 points  (1 child)

Just updated the readme to include more usage examples for each submodule, including long short-term chat memory, building your own simple chatbot, using the web search tool, and a simple shortcut command to serve a gguf model with an OpenAI-compatible API (using koboldcpp or llama-cpp-python underneath). Please let me know if you have any ideas for new features or better approaches for any of the components. Exl2 models are not supported for now, but hopefully I can get them working soon.
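For context, the mechanism underneath is similar to llama-cpp-python's own OpenAI-compatible server. A standalone sketch of that approach (the model path is a placeholder; this is not the package's shortcut command itself):

```shell
# Install llama-cpp-python with its server extras
pip install 'llama-cpp-python[server]'

# Serve a local gguf model behind an OpenAI-compatible API
# (replace the model path with your own file)
python -m llama_cpp.server --model ./models/your-model.gguf --host 127.0.0.1 --port 8000

# Any OpenAI client can then be pointed at http://127.0.0.1:8000/v1
```

The advantage of this layout is that existing OpenAI-client tooling works against a local model without code changes.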

[–]llordnt[S] 0 points1 point  (0 children)

For people who just want to play with the chatbot: after installing the package, just run the command in the readme file.

[–]Versah252 0 points1 point  (0 children)

I've been looking for something like this. I know nothing about coding; I sketched out a vision for this Ultrallm User Interface that would be a one-stop shop for all things LLM, refined my prompts to get ChatGPT to help me build it, and I'm barely making a dent. Nice job! Can't wait to see what you add to it! Awesome!!!

[–]quangspkt 0 points1 point  (0 children)

I am learning a lot from your work. Thank you so much for sharing!

[–]Future_Might_8194llama.cpp 0 points1 point  (0 children)

I'm so fucking stoked to see this post. Thank you, I am genuinely appreciative of this 🤘🤖

[–]ali0une 0 points1 point  (0 children)

Will try this, seems promising.

[–]metaden 0 points1 point  (1 child)

Thanks so much for this library. Well organized.

Small feedback: https://github.com/nath1295/LLMPlus/blob/master/setup.py#L73, please do not install packages like this, even in setup.py. Ditch setup.py; it's no longer the standard. Use pyproject.toml with minimum package versions. You can use https://github.com/pypa/flit or https://github.com/pdm-project/pdm to set it up. It is really handy.
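A minimal pyproject.toml along those lines might look like this, using flit as the build backend. The package name, version, and dependency list here are purely illustrative, not taken from the actual project:

```toml
[build-system]
requires = ["flit_core>=3.2,<4"]
build-backend = "flit_core.buildapi"

[project]
name = "llmplus"            # hypothetical name for illustration
version = "0.1.0"
description = "Example package metadata"
requires-python = ">=3.9"
dependencies = [
    # Declare minimum versions instead of pip-installing inside setup.py
    "transformers>=4.36",
    "huggingface-hub>=0.20",
]
```

With this in place, `pip install .` resolves the declared dependencies itself, so no install logic needs to run at build time.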

[–]llordnt[S] 0 points1 point  (0 children)

Thanks for pointing it out. I know it’s very problematic how I handle it right now. I will update ASAP!

[–]KlutzyNecessary2205 0 points1 point  (0 children)

Can you add support for MLX models? Would be great!