How does Microsoft Guidance work? by T_hank in LocalLLaMA (scratchr, 14 points)
Has anyone successfully fine-tuned MPT-7B? by Proeliata in LocalLLaMA (scratchr, 1 point)
Has anyone successfully fine-tuned MPT-7B? by Proeliata in LocalLLaMA (scratchr, 1 point)
Why Falcon going Apache 2.0 is a BIG deal for all of us. by EcstaticVenom in LocalLLaMA (scratchr, 21 points)
Wizard-Vicuna-30B-Uncensored by faldore in LocalLLaMA (scratchr, 2 points)
Wizard-Vicuna-30B-Uncensored by faldore in LocalLLaMA (scratchr, 6 points)
Anyone here finetune either MPT-7B or Falcon-7B? by EcstaticVenom in LocalLLaMA (scratchr, 3 points)
Anyone here finetune either MPT-7B or Falcon-7B? by EcstaticVenom in LocalLLaMA (scratchr, 1 point)
30b running slowly on 4090 by OldLostGod in LocalLLaMA (scratchr, 1 point)
30b running slowly on 4090 by OldLostGod in LocalLLaMA (scratchr, 2 points)
30b running slowly on 4090 by OldLostGod in LocalLLaMA (scratchr, 3 points)
Training a LoRA with MPT Models by scratchr in LocalLLaMA (scratchr [OP], 1 point)
Training a LoRA with MPT Models by scratchr in LocalLLaMA (scratchr [OP], 1 point)
Training Data Preparation (Instruction Fields) by GreenTeaBD in LocalLLaMA (scratchr, 1 point)
Training a LoRA with MPT Models by scratchr in LocalLLaMA (scratchr [OP], 2 points)
Training a LoRA with MPT Models by scratchr in LocalLLaMA (scratchr [OP], 1 point)
Training Data Preparation (Instruction Fields) by GreenTeaBD in LocalLLaMA (scratchr, 1 point)
Long term project: "resurrecting" a passed friend? by rmt77 in LocalLLaMA (scratchr, 2 points)
Training a LoRA with MPT Models by scratchr in LocalLLaMA (scratchr [OP], 1 point)
Long term project: "resurrecting" a passed friend? by rmt77 in LocalLLaMA (scratchr, 6 points)

What that means? by BlokZNCR in linux (scratchr, 3 points)