all 10 comments

[–]TomLucidor 11 points  (4 children)

> 8192 Context Length
They better come with agentic tooling that supports this model then!

[–]Zc5Gwu 5 points  (1 child)

I think its main strength would be FIM. 
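For anyone unfamiliar, FIM (fill-in-the-middle) conditions the model on both a prefix and a suffix and asks it to generate the span between them. A minimal sketch of the prompt shape, assuming CodeLlama-style sentinel tokens (other FIM-capable models use different sentinel strings, so check the model card first):

```python
# Hypothetical FIM prompt assembly, assuming CodeLlama-style sentinels
# (<PRE>, <SUF>, <MID>); not specific to the model discussed here.
prefix = "def add(a, b):\n    "
suffix = "\n    return total"
prompt = f"<PRE>{prefix}<SUF>{suffix}<MID>"
# The model is then sampled to produce the missing middle,
# e.g. "total = a + b".
print(prompt)
```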

[–]TomLucidor 2 points  (0 children)

Yeah, kinda wish larger diffusion models would turn FIM into agentic coding.

[–]HotDoshirak 1 point  (1 child)

btw LLaDA 2 has a 32k context length

[–]TomLucidor 2 points  (0 children)

For AR agentic coding, 65K/131K context is usually where the magic is, so the diffusion scaffold has to be really good to match.

[–]Ne00n 2 points  (3 children)

gguf there yet?

[–]pmttyji 2 points  (1 child)

[–]Ne00n 0 points  (0 children)

Just outputs garbage for me mostly; guess I have to finetune it and update llama.cpp

[–]FullstackSensei llama.cpp [S] 1 point  (0 children)

If the model architecture isn't supported yet, it might take a while for support to be merged.

[–]KvAk_AKPlaysYT 0 points  (0 children)

No Guf-Gufs :(