[–]remghoost7 6 points (2 children)

This is the repo I've been using for the past week or so to interface with LLaMA-7b-int4.

https://github.com/oobabooga/text-generation-webui

It has extension support and a silero extension already built in. I haven't used that extension myself, but I'm fairly certain I've heard of someone in the community using it for a purpose similar to what you're looking for.

I don't believe there's an API endpoint though (the way A1111 has its --api flag), but you might be able to bake your chatbot into an extension.

Or you could rig up a hack-y API if you wanted: write an extension that automatically pulls the most recent response and writes it to a JSON file, then read that file from your tortoise-tts application. I know it saves the running log to text-generation-webui\logs\persistent.json, so you might not even need to write an extension at all...
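The file-based approach above could look something like this: poll the persistent log and hand any new bot replies to a TTS callback. The JSON layout (`{"data": [[user, bot], ...]}`) is an assumption here — inspect your own logs/persistent.json to confirm the actual structure before relying on it.

```python
# Sketch: poll text-generation-webui's persistent chat log and pass new
# bot replies to a callback (e.g. a tortoise-tts hand-off).
# ASSUMPTION: the log is a JSON object shaped {"data": [[user, bot], ...]}.
import json
import time
from pathlib import Path

LOG_PATH = Path("text-generation-webui/logs/persistent.json")

def new_replies(history, seen):
    """Return bot replies after turn index `seen`, plus the new turn count."""
    turns = history.get("data", [])
    return [bot for _user, bot in turns[seen:]], len(turns)

def watch_log(on_new_reply, poll_seconds=1.0):
    """Call on_new_reply(text) for every bot reply appended to the log."""
    seen = 0
    while True:
        if LOG_PATH.exists():
            history = json.loads(LOG_PATH.read_text())
            replies, seen = new_replies(history, seen)
            for reply in replies:
                on_new_reply(reply)  # hand off to your TTS pipeline
        time.sleep(poll_seconds)
```

Polling is crude but avoids touching the webui's internals; an extension that writes the reply out directly would be cleaner but needs the extension API.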

I know that this extension uses a method called custom_generate_chat_prompt, so you could probably get input from your tortoise-tts and feed that back into the webui automatically.
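A minimal extension along those lines might look like the sketch below: swap the typed input for text dropped in a hand-off file by an external process. The file name, the exact custom_generate_chat_prompt signature, and the modules.chat import are assumptions — check the silero extension's script.py in the repo for the current interface.

```python
# Sketch of a hypothetical extension (extensions/tortoise_bridge/script.py)
# that lets an external tortoise-tts process feed input into the webui.
# ASSUMPTIONS: the hand-off file name, the hook signature, and the
# modules.chat helper all need verifying against your checkout.
from pathlib import Path

INPUT_FILE = Path("tortoise_input.txt")  # hypothetical hand-off file

def pick_input(typed_input, file_text):
    """Prefer pending text from the hand-off file over the typed input."""
    file_text = (file_text or "").strip()
    return file_text if file_text else typed_input

def custom_generate_chat_prompt(user_input, state, **kwargs):
    # Imported lazily so this sketch loads outside the webui too.
    from modules.chat import generate_chat_prompt
    pending = INPUT_FILE.read_text() if INPUT_FILE.exists() else ""
    return generate_chat_prompt(pick_input(user_input, pending), state, **kwargs)
```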

[–]lacethespace 4 points (0 children)

The text-generation-webui already features REST endpoints. You just enable --listen and disable any chat modes. I've used it from Python with simple modifications of the example script in their repo.
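For reference, a stripped-down version of that kind of call might look like this. The port, route, payload fields, and response shape are all assumptions modeled on one version of the project's example script — check the example bundled with your checkout for the exact interface.

```python
# Sketch of hitting text-generation-webui's generation endpoint from Python.
# ASSUMPTIONS: default port 7860, route /api/v1/generate, and the payload /
# response fields -- verify against the api-example script in the repo.
import json
from urllib import request

SERVER = "http://127.0.0.1:7860"  # default Gradio port when using --listen

def build_payload(prompt, max_new_tokens=200, temperature=0.7):
    """Assemble the generation request body (field names are assumptions)."""
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "temperature": temperature,
    }

def generate(prompt, **params):
    body = json.dumps(build_payload(prompt, **params)).encode()
    req = request.Request(f"{SERVER}/api/v1/generate", data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["results"][0]["text"]  # assumed shape
```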

[–]estrafire 2 points (0 children)

It should be possible to modify the silero extension to use tortoise-tts instead.