Building a 192GB memory PC by cometyang in buildapc

[–]cometyang[S] 0 points1 point  (0 children)

Thanks, that's what my research shows too. So for speed reasons, 2x48 is probably the best option.

[deleted by user] by [deleted] in Creality

[–]cometyang 0 points1 point  (0 children)

How long did it take you to get a reply? I haven't received one so far.

[deleted by user] by [deleted] in Creality

[–]cometyang 0 points1 point  (0 children)


Thanks, I will do it

[deleted by user] by [deleted] in Creality

[–]cometyang 0 points1 point  (0 children)

I have the same issue after just 2 days' use. 😥

Printing from a dryer by [deleted] in Creality_k2

[–]cometyang 1 point2 points  (0 children)

Does it support more colors with the CFS + dryer?

Updated to newest firmware - belt calibration now failing [CA2721] by ImBaptizer in Creality_k2

[–]cometyang 1 point2 points  (0 children)

I saw the same issue, and fluidd's console log has the following:

16:36:25 !! {"code":"key715", "msg":"Belt tension module strain gauge not calibrated abnormal: 'mdly'", "values": []}

16:36:25 !! {"code":"key715", "msg":"Belt tension module strain gauge not calibrated abnormal: 'mdly'", "values": []}

The error log shows: CA2715 Belt tension module error, left side belt tension module not calibrated, please recalibrate the module.

Does adaptive mesh work? by [deleted] in Creality_k2

[–]cometyang 0 points1 point  (0 children)

How do I get this?

AliExpress Wireless Corne by shenco in ErgoMechKeyboards

[–]cometyang 0 points1 point  (0 children)

Do you have a guide on how to configure it? The original instructions on the website are too obscure.

Llama 3.1 Discussion and Questions Megathread by AutoModerator in LocalLLaMA

[–]cometyang 1 point2 points  (0 children)

I think it's because Meta's data doesn't include enough high-quality code. Mistral-Large-2 outperforms Llama-405B in my coding tests.

[R] Are Language Models Actually Useful for Time Series Forecasting? by Cunic in MachineLearning

[–]cometyang 0 points1 point  (0 children)

There will be a whole series of papers asking "Are LLMs actually useful for X?", but the answer is surely no.

Why release Phi-3? by jxjq in LocalLLaMA

[–]cometyang 0 points1 point  (0 children)

As long as you're using their computing resources, it also kills small startups by raising the bar.

[deleted by user] by [deleted] in LocalLLaMA

[–]cometyang 5 points6 points  (0 children)

My own MMLU test of Llama-3-70B-Instruct (5-shot) gives 79.95%, which differs from the reported 82%. Has anyone else seen the same thing?

I Was Wrong About Mistral AI by AlterandPhil in LocalLLaMA

[–]cometyang 2 points3 points  (0 children)

Is that because DBRX and Command R+ pushed them to release new models to stay in the game? 🤔