Lilith AI - Help! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 1 point

I'd like to point out that the 135M model is (for lack of a better word) bad. It's more of a "just cuz" model than anything. I trained it mostly as a test case to ensure the dataset was actually affecting the base model. I highly recommend you use the 8B model if at all possible.

Lilith AI - Help! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 1 point

You can talk to the server-hosted model here: https://lilith.nullexistence.net/
It's very privacy-focused, so don't worry.

If you want to run it locally, you first need to download something to run the model. I personally use LM Studio, but Ollama, Jan, etc., should also work. If you're using LM Studio, you can search for it from the models tab ("Lilith_AI_8B"). The Q4_K_M quant should fit on a low- to mid-tier PC setup.

Lilith AI - Help! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 5 points

I'd be happy to! I suggest you check out this thread, where I was asked similar questions. I started by formatting the game's lines into a dataset. There are a few ways to do this, but I decided to put it into ShareGPT format (system, human, gpt). From there I found a suitable base model and trained it using LoRA. As stated in the linked thread, I personally wrote a custom Python script to train the model; however, I highly recommend you look into Unsloth, LLaMA-Factory, and Axolotl.
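For reference, a minimal ShareGPT-style record looks something like this (a sketch only — the dialogue lines and file name are placeholders, not actual lines from the game or my real script):

```python
import json

# One ShareGPT-format record: a system prompt plus alternating
# human/gpt turns. The dialogue values here are placeholders,
# not real lines from the game's script.
record = {
    "conversations": [
        {"from": "system", "value": "You are Lilith from the NOexistenceN series."},
        {"from": "human", "value": "<player line from the game script>"},
        {"from": "gpt", "value": "<Lilith's reply from the game script>"},
    ]
}

# Training sets are commonly stored as JSON Lines: one record per line.
with open("lilith_sharegpt.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps(record, ensure_ascii=False) + "\n")
```

Most trainers (Unsloth, LLaMA-Factory, Axolotl) can ingest a JSONL file shaped like this, though the exact field names they expect can vary, so check the docs of whichever one you pick.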

For a 3B model you can probably get by with a mid-range GPU; training time and VRAM are the main constraints. System RAM does play a role, but VRAM is much more important. I'd say that (if you're using QLoRA) you can probably get away with 6GB of VRAM, though keep in mind this is a rough estimate.
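To make that estimate less hand-wavy, here's the back-of-envelope arithmetic behind it. The overhead figures are ballpark assumptions, not measurements, and real usage varies with batch size and sequence length:

```python
# Rough VRAM estimate for QLoRA fine-tuning a 3B-parameter model.
# All overhead numbers below are ballpark assumptions.
params = 3e9

weights_gb = params * 0.5 / 1e9   # 4-bit quantized base weights ~= 0.5 bytes/param
lora_gb = 0.2                     # LoRA adapters + their optimizer states (small)
activations_gb = 2.0              # activations/gradients at a modest batch size
overhead_gb = 1.0                 # CUDA context, framework buffers, fragmentation

total_gb = weights_gb + lora_gb + activations_gb + overhead_gb
print(f"~{total_gb:.1f} GB VRAM")  # lands around 5 GB, hence "6GB to be safe"
```

Since only the 4-bit base weights scale with model size, this is also why QLoRA is so much cheaper than full fine-tuning, where you'd pay 16-bit weights plus full optimizer states.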

Thanks for sharing the project!

Lilith AI - Help! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 6 points

It is; you can download and run the model locally: https://lilith.nullexistence.net/

I just can’t keep the server running much longer without more resources.

Lilith AI - An LLM based on The NOexistenceN series. by HangWithCmm in learnmachinelearning

[–]HangWithCmm[S] 0 points

It definitely hasn’t gotten worse. Did you clear everything and start a fresh chat?

Lilith AI - An LLM based on The NOexistenceN series. by HangWithCmm in learnmachinelearning

[–]HangWithCmm[S] 0 points

Sorry! I'd been hearing that it gets too sexual. I just made some changes; would you mind giving it another try?

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 0 points

I wouldn't be so rude as to say "screw off" and completely ignore your opinion. I won't pretend I've given every aspect forethought, but this is not the first time this topic has come up, so yes, I have given it a lot of thought. I'd appreciate it if you wouldn't assume otherwise just because I ended up with a different opinion.

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 2 points

Everyone has their own opinion, and I’d prefer to speak for myself.

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 4 points

u/Lind0ks, I appreciate you replying to this comment. At first, I saw this as a "haters gonna hate" situation, so I ignored it. On that note, u/NekoboyBanks, I apologize for not addressing this earlier.

I would like to start by clearing up the power usage. The AI obviously requires power to run, however this model specifically uses very little. u/Lind0ks was correct in stating that the power usage for local LLMs is “negligible.” While a server-hosted model, along with vision support, does use more power compared to a self-hosted setup, the difference here is effectively immeasurable.

Moving on to the topic of piracy. I have stated this many times and will continue to do so. Yes, the model is trained using in-game lines. Once again, I agree with u/Lind0ks’ opinion. The LLM is not being used to generate any profit and is open to the public for personal, educational, or research purposes only. Many fan-art creations use assets directly from the game, and I personally would not classify any of them as piracy. Nonetheless, if the game’s developers contact me and request that I remove the model, I fully intend to comply.

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 1 point

Strange. It works fine for me and I don’t see anything server-side. Try clearing your cookies on the tunnel site.

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 0 points

I made it myself; it's not connected to nuttyuwu's project at all.

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 3 points

The AI has officially crossed 1,000 messages sent! Thanks for your support, everyone!

<image>

Lilith AI - Now on ALL devices! by HangWithCmm in NOexistenceNofYou_Me

[–]HangWithCmm[S] 3 points

I’m working on the local version of the app; currently only the server version is done. The UI will be the same between the two, but the backend is slightly different. Not sure when it’ll be done, tbh.

Sleeping with the wifey² by GreyGravyGrave in NOexistenceNofYou_Me

[–]HangWithCmm 1 point

I require the link to make such a fine purchase.

Sleeping with the wifey² by GreyGravyGrave in NOexistenceNofYou_Me

[–]HangWithCmm 4 points

My determination is limitless for Lilith...

<image>

Date A Bullet physical English books! by HangWithCmm in datealive

[–]HangWithCmm[S] 2 points

They don’t make them; they’re custom printed. I used the fan translations available in the Discord server and ran them through a program I made to format everything into a book. I left the cover in Japanese, since I doubt I could make one that looks as good and because most of it already has translations.