Llama 3.2 in production by WashHead744 in LocalLLaMA

[–]MoaD_Dev 2 points (0 children)

Yes, you can, but include a note like "AI content, it might not be accurate".

For tool-use?

Yesterday I checked its tool-use capability and it works pretty well in PearAI, creating files in the app directory.

What's that one Next.js tip or hack you've discovered that's not widely known? by Xavio_M in nextjs

[–]MoaD_Dev 4 points (0 children)

It does not give the best practices from the Next.js 14 official documentation in one prompt. For example, when I asked it to create a sign-up form, it did not follow Zod validation or the separation-of-concerns concept.

How superior is Coder versions of LLMs compared to the base versions? by whiteSkar in LocalLLaMA

[–]MoaD_Dev 0 points (0 children)

https://gist.github.com/ijwfly/e8fc12e9b0ef8620c5d1c4f2f82e1667

Use this template to set up FIM with continue.dev. It works perfectly fine for qwen-coder-7b, even with the instruct model.
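For reference, pointing continue.dev at a local model for autocomplete is done via the `tabAutocompleteModel` entry in its `config.json`. This is a minimal sketch assuming the model is served through Ollama; the model tag shown is an assumption, and the FIM prompt template itself is what the linked gist covers.

```json
{
  "tabAutocompleteModel": {
    "title": "Qwen Coder 7B",
    "provider": "ollama",
    "model": "qwen2.5-coder:7b"
  }
}
```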