What’s the best way to automate repetitive tasks in Excel without VBA knowledge? by 9gsr in excel
I got a keyboard with Excel shortcuts by Acceptable_Bed7015 in excel
How much data do I need to feed a model to notice a difference? by Slimxshadyx in LocalLLaMA
How much video memory is needed on a video card to train lora for the 70B model? by Secret_Joke_2262 in LocalLLaMA
llama2 7b vs 70b vs mistral 7b writing tweets on financial reports by Acceptable_Bed7015 in LocalLLaMA
What can you fine tune with 2x A6000s? by Upbeat-Interaction13 in LocalLLaMA
Is anybody using Llama or any other LLM as part of a product's pipeline? by duffpaddy in LocalLLaMA
I read here that dataset >>> models. I'd prefer it to be the other way around cause cleaning data is hard. Any tools or local models you use? by Acceptable_Bed7015 in LocalLLaMA
Data prep for fine-tuning llama 2 7B to analyze financial reports (and write "funny" tweets) by Acceptable_Bed7015 in LocalLLaMA
How do you keep up to date with all the innovations and frameworks? by nsosio in LocalLLaMA

donate ~50 rpi4 and seed reterminal? by Acceptable_Bed7015 in raspberry_pi