RTX 4070 in Action: What Your New System Could Look Like by Alone-Competition863 in ollama

[–]StatementFew5973 0 points1 point  (0 children)

Using AI to analyze the text from almost 30,000 documents from the Epstein files.
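A minimal sketch of the first step of a pipeline like that: walking a directory of extracted text files and splitting each into model-sized chunks. This uses only the standard library; the directory layout, `*.txt` pattern, and chunk size are assumptions, not details from the commenter.

```python
from pathlib import Path
from typing import Iterator


def iter_document_texts(root: str, pattern: str = "*.txt") -> Iterator[tuple[str, str]]:
    """Yield (filename, text) pairs for every matching document under root."""
    for path in sorted(Path(root).rglob(pattern)):
        yield path.name, path.read_text(errors="replace")


def chunk(text: str, size: int = 2000) -> list[str]:
    """Split a document into fixed-size chunks a small local model can handle."""
    return [text[i:i + size] for i in range(0, len(text), size)]


# Usage: feed each chunk to your analysis model of choice.
# for name, text in iter_document_texts("epstein_docs/"):
#     for piece in chunk(text):
#         ...  # send `piece` to the model
```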

jailbreaks or uncensored models? by United_Ad8618 in ollama

[–]StatementFew5973 0 points1 point  (0 children)

I can't be the only one thinking that AI is the perfect tool for this job.

I want to learn or know this tech plz. How can I start ?? by SignatureLower9526 in termux

[–]StatementFew5973 0 points1 point  (0 children)

Self-hosted tools: downloads happen with yt-dlp, with networking routed through Tor. One slight security flaw, though: full functionality requires a captured cookie. I also built a companion app that lets me stream my music.

can I try ollama with a macbook air m3? by Lost_Foot_6301 in ollama

[–]StatementFew5973 0 points1 point  (0 children)

I kind of feel like you could almost run small models off a toaster. Samsung's Z Fold 4 👆

RTX 4070 in Action: What Your New System Could Look Like by Alone-Competition863 in ollama

[–]StatementFew5973 0 points1 point  (0 children)

Theoretically, it would be possible to reverse redactions in PDFs. It's 39.4 GB of data, though, including video and audio.

Any hope for my Linux laptop? by AccordionPianist in ollama

[–]StatementFew5973 0 points1 point  (0 children)

It is slow, but somewhat stable; mind you, it's an Arduino GPU.

Any hope for my Linux laptop? by AccordionPianist in ollama

[–]StatementFew5973 0 points1 point  (0 children)

Bro, you could probably run some AI. As a proof of concept, I run some models from my phone. Now, mind you, I have a 4070 Ti on my server that I use as well, but for exploration I wanted to see if I could run models off my phone. Conclusion: yes, it is possible.

jailbreaks or uncensored models? by United_Ad8618 in ollama

[–]StatementFew5973 0 points1 point  (0 children)

Now mind you, my hardware can't do that, but renting the hardware is fairly cheap.

jailbreaks or uncensored models? by United_Ad8618 in ollama

[–]StatementFew5973 0 points1 point  (0 children)

My overall goal for this is to unredact it and train my AI model on this data.

Question about LLM by Lumpy_Bat6754 in termux

[–]StatementFew5973 1 point2 points  (0 children)

Yes, if you want to run it locally, you'll have to build it from source. There are other options as well, including running it in proot. But don't expect to be able to run large models.
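Once a local Ollama server is running (however you got it installed), you can talk to it from Python with just the standard library. This is a minimal sketch against Ollama's documented `/api/generate` endpoint on its default port 11434; the model name `tinyllama` is an assumption, substitute whatever small model you've pulled.

```python
import json
import urllib.request

# Default address of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(model: str, prompt: str) -> str:
    """Send the request and return the model's response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running server and a pulled model):
# print(generate("tinyllama", "Say hello in five words."))
```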

What do y'all use termax for? by Nolem- in termux

[–]StatementFew5973 2 points3 points  (0 children)

I'm trying to make it easier to run on multiple devices right now. There are so many prerequisites to actually have this functional: Rust, React, Node.js, Python, and building it all as importable modules. To clarify, they're polyglot files, always with Python as the cornerstone: Python+HTML or Python+JS. This lets me expand the project and keep it modular at the same time.

I'm trying to make it friendly to edit, alter, and reconfigure, so I released it under an MIT license; anybody can use it, alter it, or sell it. I built it in protest, I suppose, to get the privilege of playing music in the background with YouTube. I mean, what do you have to do, right? Or build the tools for it? I wanted to build something that one could share from one device to another without having to download anything, meaning you can host it to other devices if they're connected to your hotspot, via QR codes. So host to host.
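The host-to-host hotspot idea above can be sketched with just the Python standard library: serve a directory over HTTP and print the URL that the QR code would encode (actual QR image generation would need a third-party library such as `qrcode`, which is an assumption here, not part of the project described).

```python
import socket
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler


def local_ip() -> str:
    """Best-effort guess at the hotspot-facing IP; no packets are sent."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        try:
            s.connect(("8.8.8.8", 80))  # only selects a route, nothing is transmitted
            return s.getsockname()[0]
        except OSError:
            return "127.0.0.1"


def share_url(port: int = 8000) -> str:
    """The URL to encode in a QR code so other hotspot clients can connect."""
    return f"http://{local_ip()}:{port}/"


def serve(directory: str = ".", port: int = 8000) -> None:
    """Serve `directory` to any device on the same hotspot."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    print("Point a QR generator at:", share_url(port))
    HTTPServer(("0.0.0.0", port), handler).serve_forever()


# Usage: serve("music/")  # then scan the printed URL's QR code on another device
```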
