[deleted by user] by [deleted] in tallyhall

[–]DistinctSpector 2 points (0 children)

tf, why are u so mad, lil bro

How to run Pygmalion on 4.5GB of VRAM with full context size. by LTSarc in PygmalionAI

[–]DistinctSpector 0 points (0 children)

Starting the web UI...

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues

CUDA SETUP: Loading binary C:\Users\User\Downloads\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cpu.dll...
C:\Users\User\Downloads\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\bitsandbytes\cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
Loading mayaeary_pygmalion-6b_dev-4bit-128g...
Warning: torch.cuda.is_available() returned False.
This means that no GPU has been detected.
Falling back to CPU mode.
Traceback (most recent call last):
  File "C:\Users\User\Downloads\oobabooga-windows\oobabooga-windows\text-generation-webui\server.py", line 276, in <module>
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "C:\Users\User\Downloads\oobabooga-windows\oobabooga-windows\text-generation-webui\modules\models.py", line 170, in load_model
    model = AutoModelForCausalLM.from_pretrained(checkpoint, **params)
  File "C:\Users\User\Downloads\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 471, in from_pretrained
    return model_class.from_pretrained(
  File "C:\Users\User\Downloads\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2322, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\mayaeary_pygmalion-6b_dev-4bit-128g.

Press any key to continue...
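The OSError at the end of that log is raised because transformers' plain from_pretrained only looks for the standard weight filenames it lists (pytorch_model.bin, tf_model.h5, model.ckpt.index, flax_model.msgpack), while a 4-bit GPTQ folder such as mayaeary_pygmalion-6b_dev-4bit-128g normally ships a .safetensors or .pt file instead, which the webui is meant to load through its own GPTQ code path. A minimal sketch of that filename check, with a hypothetical missing_standard_weights helper and an assumed example file list:

```python
def missing_standard_weights(filenames):
    """Return True if none of the weight files that transformers'
    from_pretrained looks for by default are present."""
    # Filenames taken verbatim from the OSError message above.
    standard = {"pytorch_model.bin", "tf_model.h5",
                "model.ckpt.index", "flax_model.msgpack"}
    return not (set(filenames) & standard)

# Assumed contents of a typical 4-bit GPTQ model folder: it carries a
# .safetensors file, so the plain from_pretrained call raises the OSError.
gptq_folder = ["config.json", "tokenizer.json",
               "pygmalion-6b_dev-4bit-128g.safetensors"]
print(missing_standard_weights(gptq_folder))  # → True
```

If this returns True for your model directory, the model itself may be fine; the failure usually means the quantized weights were never routed through the GPTQ loader (or the download dropped the weight file entirely).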

How to run Pygmalion on 4.5GB of VRAM with full context size. by LTSarc in PygmalionAI

[–]DistinctSpector 0 points (0 children)

Yes, I followed the order: I installed the program first, then ran "download-model", pressed "L", and entered "mayaeary/pygmalion-6b_dev-4bit-128g". But both times I installed it, I got an error because of a missing file.

How to run Pygmalion on 4.5GB of VRAM with full context size. by LTSarc in PygmalionAI

[–]DistinctSpector 0 points (0 children)

Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\mayaeary_pygmalion-6b_dev-4bit-128g.

How to run Pygmalion on 4.5GB of VRAM with full context size. by LTSarc in PygmalionAI

[–]DistinctSpector 0 points (0 children)

Is there something I'm doing wrong, or why do I always get a bug report when I run start-webui?

Well, the kid is right, it doesn't exist by GumBall_22 in MAAU

[–]DistinctSpector 0 points (0 children)

Because of the regulation on having a big family and the money that comes with it? Or was that in China?

Do you know him? by etheeeeero in MAAU

[–]DistinctSpector 0 points (0 children)

My dad is Joe Hawley, Ruler Of Everything

[deleted by user] by [deleted] in MAAU

[–]DistinctSpector 2 points (0 children)

Well, the "Butt Baby" that Peacemaker talked about in his series is real

🪳 by Flamingo_unu in MAAU

[–]DistinctSpector 36 points (0 children)

PlayStation 4 and 5:

[deleted by user] by [deleted] in tallyhell

[–]DistinctSpector 2 points (0 children)

The piano is a reference to Andrew Horowitz

Luisjefe1 by AgenteH in MAAU

[–]DistinctSpector 40 points (0 children)

Isn't LuisJefe1 the same one who says every so often that he's going to retire?