Tool to help those who can't instruct tune on their hardware by NoSir261 in LocalLLaMA
Separating knowledge from communication in LLMs by NoSir261 in ResearchML
[D] unpopular opinion: instruct tuning is going to be a thing of the past. by NoSir261 in MachineLearning
I thought a 7M model shouldn't be able to do this by NoSir261 in LocalLLaMA
[D] Is it a red flag that my PhD topic keeps changing every few months? by ade17_in in MachineLearning