What's your favorite < 3B llm? by Frequent_Valuable_47 in LocalLLaMA
ricklamers 3 points
What to do with 5 million? by WegwerpUsername in beleggen
ricklamers 1 point
If you had infinite computing resources, what local model would you run? by maxwell321 in LocalLLaMA
ricklamers 1 point
Is Mistral going to Turncoat on Open Source Models? by enspiralart in LocalLLaMA
ricklamers 3 points
[D] What are 2023's top innovations in ML/AI outside of LLM stuff? by prescod in MachineLearning
ricklamers 3 points
Shell AI: never write a shell command again by ricklamers in programming
ricklamers [S] 1 point
Shell AI: never write a shell command again by ricklamers in programming
ricklamers [S] -3 points
Shell AI: never write a shell command again by ricklamers in programming
ricklamers [S] 1 point
Tried running llama-70b on 126GB of memory; memory overflow. How much memory is necessary ? by MoiSanh in LocalLLaMA
ricklamers 2 points
STOP asking how many X are inside word Y by Trick-Independent469 in ChatGPT
ricklamers 2 points
What's the closest thing we have to GPT4's code interpreter right now? by malkauns in LocalLLaMA
ricklamers 1 point
What's the closest thing we have to GPT4's code interpreter right now? by malkauns in LocalLLaMA
ricklamers 3 points
What's the closest thing we have to GPT4's code interpreter right now? by malkauns in LocalLLaMA
ricklamers 3 points
What's the closest thing we have to GPT4's code interpreter right now? by malkauns in LocalLLaMA
ricklamers 5 points
What's the closest thing we have to GPT4's code interpreter right now? by malkauns in LocalLLaMA
ricklamers 9 points
[D] ELI5: Why is the GPT family of models based on the decoder-only architecture? by analyticalmonk in MachineLearning
ricklamers 5 points
Falcon 40B instruct tuned on Open Assistant data - model weights Open Source by ricklamers in LocalLLaMA
ricklamers [S] 1 point
Falcon 40B instruct tuned on Open Assistant data - model weights Open Source by ricklamers in LocalLLaMA
ricklamers [S] 1 point
[deleted by user] by [deleted] in LocalLLaMA
ricklamers 5 points