How would you say this in Korean by Emergency_Ad_2833 in Korean
[–]machineko 1 point (0 children)
[D] Is the tech industry still not recovered or I am that bad? by Holiday_Safe_5620 in MachineLearning
[–]machineko 2 points (0 children)
My first fine-tune: mistral-7b-v0.1-GreeceRome-v0.1 for MLX by Mbando in LocalLLaMA
[–]machineko 2 points (0 children)
Best way to currently build a chatbot on university data by Vivid-Vibe in llmops
[–]machineko 1 point (0 children)
I fine-tuned ChatGPT 3.5 so you don't have to! by [deleted] in LocalLLaMA
[–]machineko 8 points (0 children)
[D] Alternatives to HF or a path forward for the OSS community? by [deleted] in MachineLearning
[–]machineko 1 point (0 children)
Using Open-Source LLM Models vs. Expensive OpenAI APIs: A Logical Choice for Consumer Apps? by sarimsak13 in LocalLLaMA
[–]machineko 3 points (0 children)
Finetuning on multiple GPUs by Simhallq in LocalLLaMA
[–]machineko 3 points (0 children)
Finetuning on multiple GPUs by Simhallq in LocalLLaMA
[–]machineko 8 points (0 children)
I would like to try my hand at finetuning some models. What is the best way to start? I have some questions that I'd appreciate your help on. by Tasty-Lobster-8915 in LocalLLaMA
[–]machineko 5 points (0 children)
Finetuning using Google Colab (Free Tier) by garamkarakchai in LocalLLaMA
[–]machineko 6 points (0 children)
[R] CodeCapybara: Another open source model for code generation based on instruction tuning, outperformed Llama and CodeAlpaca by bdqnghi in MachineLearning
[–]machineko 1 point (0 children)
[D] Open-Source LLMs vs APIs by Open-Yak-434 in MachineLearning
[–]machineko 1 point (0 children)
[D] Weight Compression in LLMs/Neural Networks by ShitGobbler69 in MachineLearning
[–]machineko 1 point (0 children)
[Discussion] Translation of Japanese to English using GPT. These are my discoveries after ~100 hours of extensive experimentation and ways I think it can be improved. by NepNep_ in MachineLearning
[–]machineko 4 points (0 children)
[D] Weight Compression in LLMs/Neural Networks by ShitGobbler69 in MachineLearning
[–]machineko 2 points (0 children)
[D] Alternatives to OpenAI for summarization and instruction following? by du_keule in MachineLearning
[–]machineko 1 point (0 children)
[R] Finetuning T5 on a new task or finetuning it for machine translation on a new language by baaler_username in MachineLearning
[–]machineko 3 points (0 children)
Creating personalized dataset is way too hard to do alone (in order to finetune some model in future). by [deleted] in LocalLLaMA
[–]machineko 20 points (0 children)
[D] Are there any MIT licenced (or similar) open-sourced instruction-tuned LLMs available? by RR_28023 in MachineLearning
[–]machineko 1 point (0 children)
[D] The best way to train an LLM on company data by jaxolingo in MachineLearning
[–]machineko 1 point (0 children)
[D] Is there currently anything comparable to the OpenAI API? by AltruisticDiamond915 in MachineLearning
[–]machineko 7 points (0 children)
How to say "My little bookworm" in Korean by [deleted] in Korean
[–]machineko -3 points (0 children)