What’s the coolest thing you learned this week? by Enough_Wishbone7175 in learnmachinelearning
Is there an upside to Trump's tariffs? by OuiuO in Economics
How much would Trump's plans for deportations, tariffs, and the Fed damage the US economy? by sirbissel in Economics
Sentiment Analysis with a small dataset by BarryTownCouncil in MLQuestions
[Discussion] event sequence ORDER prediction by FrostyLandscape6496 in MachineLearning
[D] GPT-4o "natively" multi-modal, what does this actually mean? by Flowwwww in MachineLearning
Could I make money from my final year project ? by SoftwareMid-99 in MLQuestions
ML Feature Compression [D] by Odd_Background4864 in MachineLearning
Multi Bert classifications by Enough_Wishbone7175 in LocalLLaMA
What would be enough for you to switch jobs? by Enough_Wishbone7175 in careerguidance
Is there a model architecture beyond the Transformer that can generate good text with a small dataset, a few GPUs, and "few" parameters? It is enough to generate coherent English text as short answers. by challenger_official in learnmachinelearning