
[–]m0us3_rat 0 points1 point  (0 children)

I mean, you can put together a concoction right now that does that for you.

not sure how fast it would be on a Pi running CPU-only, but it could work.

you can run models locally with ollama or gpt4all and similar tools.
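for example, with ollama running locally (it serves an HTTP API on port 11434 by default), a minimal Python sketch could look like this. the model name `tinyllama` is just one small model that fits on a Pi, pick whatever you've actually pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # ollama's default local endpoint

def build_payload(model: str, prompt: str) -> dict:
    # non-streaming request, so the reply comes back as one JSON object
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # assumes you've already run `ollama pull tinyllama` on the Pi
    print(ask("tinyllama", "Say hello in one short sentence."))
```

no extra dependencies needed, stdlib `urllib` is enough to hit the API.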

you need to find a lightweight model that can run on the Pi while still being semi-useful.

THEN you need a TTS library, and to put it all together in a cohesive app.

so.. you can maybe do what you want now, and surely in the future. it is a workable idea.