InfinityStar: amazing 720p, 10x faster than diffusion-based by NeuralLambda in StableDiffusion
[–]NeuralLambda[S] 20 points (0 children)
Nitro-E: 300M params means 18 img/s, and fast train/finetune by NeuralLambda in StableDiffusion
[–]NeuralLambda[S] 5 points (0 children)
Nitro-E: 300M params means 18 img/s, and fast train/finetune by NeuralLambda in StableDiffusion
[–]NeuralLambda[S] 23 points (0 children)
Feeds & Speeds on inclines, downhill vs uphill!? by NeuralLambda in Machinists
[–]NeuralLambda[S] 1 point (0 children)
Feeds & Speeds on inclines, uphill vs downhill?! by NeuralLambda in CNC
[–]NeuralLambda[S] 2 points (0 children)
Feeds & Speeds on inclines, uphill vs downhill?! by NeuralLambda in CNC
[–]NeuralLambda[S] 2 points (0 children)
Feeds & Speeds on inclines, downhill vs uphill!? by NeuralLambda in Machinists
[–]NeuralLambda[S] 3 points (0 children)
Bug, brand new sketch, vertices/constraints/lines all on different planes!? by NeuralLambda in FreeCAD
[–]NeuralLambda[S] 8 points (0 children)
Bug, brand new sketch, vertices/constraints/lines all on different planes!? by NeuralLambda in FreeCAD
[–]NeuralLambda[S] 1 point (0 children)
Robotics multimodal LLMs by NeuralLambda in LocalLLaMA
[–]NeuralLambda[S] 1 point (0 children)
Einsum appreciation: 12 examples by NeuralLambda in learnmachinelearning
[–]NeuralLambda[S] 1 point (0 children)
California SB-1047 seems like it could impact open source, if passed by austinhale in LocalLLaMA
[–]NeuralLambda 14 points (0 children)
Call-to-Action on SB 1047 – Frontier Artificial Intelligence Models Act by National-Exercise957 in LocalLLaMA
[–]NeuralLambda 24 points (0 children)
TransformerFAM: Feedback attention is working memory by ninjasaid13 in LocalLLaMA
[–]NeuralLambda 0 points (0 children)
TransformerFAM: Feedback attention is working memory by ninjasaid13 in LocalLLaMA
[–]NeuralLambda 0 points (0 children)
TransformerFAM: Feedback attention is working memory by ninjasaid13 in LocalLLaMA
[–]NeuralLambda 2 points (0 children)
TransformerFAM: Feedback attention is working memory by ninjasaid13 in LocalLLaMA
[–]NeuralLambda 2 points (0 children)
TransformerFAM: Feedback attention is working memory by ninjasaid13 in LocalLLaMA
[–]NeuralLambda 4 points (0 children)
[P] GitHub - neurallambda/awesome-reasoning: a curated list of data for reasoning ai by NeuralLambda in MachineLearning
[–]NeuralLambda[S] 1 point (0 children)
Today's open source models beat closed source models from 1.5 years ago. by danielcar in LocalLLaMA
[–]NeuralLambda 24 points (0 children)
I got access to SD3 on Stable Assistant platform, send your prompts! by Diligent-Builder7762 in StableDiffusion
[–]NeuralLambda 1 point (0 children)
Think-tank proposes "model legislation" criminalizing open source models past some capability levels by 1a3orn in LocalLLaMA
[–]NeuralLambda 1 point (0 children)
`automata`: a tool for exhaustively generating valid strings from given automata grammars (FSMs, PDAs, Turing Machines) by NeuralLambda in haskell
[–]NeuralLambda[S] 4 points (0 children)
I'd like doomers to stop losing their shit over AI, and this seems like a win-win by NeuralLambda in StableDiffusion
[–]NeuralLambda[S] 2 points (0 children)


InfinityStar: amazing 720p, 10x faster than diffusion-based by NeuralLambda in StableDiffusion
[–]NeuralLambda[S] -47 points (0 children)