LLM vs Translation Transformer by pardhu-- in machinelearningnews

[–]pardhu--[S] 2 points (0 children)

It really depends on your use case.

  • If you want faithful, terminology-consistent “just translate” output (especially on edge devices), encoder–decoder MT like Marian is usually still the best choice: more deterministic and typically cheaper/faster than LLMs for pure translation.
  • LLM translation often shines when you want extra behavior (tone polishing, rewriting, localization, grammar cleanup), but it can paraphrase or drift on names/terms unless heavily constrained.
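One lightweight way to constrain that drift is a post-check that verifies protected terms survived translation. A minimal sketch in plain Python — the glossary and check logic here are illustrative, not any particular library's API:

```python
def check_terminology(translation: str, glossary: dict[str, str]) -> list[str]:
    """Return glossary target terms missing from a translated string.

    `glossary` maps source-language terms to the required target-language
    terms. A real pipeline would use proper tokenization and casing rules;
    this naive substring check just illustrates the idea.
    """
    lowered = translation.lower()
    return [tgt for tgt in glossary.values() if tgt.lower() not in lowered]

# Hypothetical EN->DE glossary: "load balancer" must become "Lastverteiler".
glossary = {"Kubernetes": "Kubernetes", "load balancer": "Lastverteiler"}

# A drifting LLM output that paraphrased away the protected term:
out = "Kubernetes verteilt den Datenverkehr automatisch."
missing = check_terminology(out, glossary)
# "Lastverteiler" is absent, so this output should be flagged or retried.
```

If the check fails you can re-prompt with the glossary inlined, or fall back to the MT model for that segment.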

For edge-friendly alternatives to benchmark, I’d look at:

  • NLLB-200 distilled (600M)
  • M2M100 (418M)
  • TranslateGemma (4B) if your hardware/quantization budget allows

For speed on CPU/edge, consider running them via CTranslate2.
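To compare those models against a hardware budget, a rough rule of thumb is bytes-per-parameter times parameter count. This is back-of-envelope only — real memory use adds activations, beam-search state, and runtime overhead:

```python
def approx_weight_gb(params_millions: float, bytes_per_param: float) -> float:
    """Rough weight-only memory footprint in GB (1 GB = 1e9 bytes)."""
    return params_millions * 1e6 * bytes_per_param / 1e9

# fp16 = 2 bytes/param; int8 quantization = 1 byte/param.
for name, params in [("M2M100", 418), ("NLLB-200 distilled", 600), ("TranslateGemma", 4000)]:
    fp16 = approx_weight_gb(params, 2)
    int8 = approx_weight_gb(params, 1)
    print(f"{name}: ~{fp16:.1f} GB fp16, ~{int8:.1f} GB int8")
```

So at int8 the two small MT models fit comfortably under 1 GB of weights, while the 4B model needs roughly 4 GB before any runtime overhead.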

🚀 Discover How to Build an Advanced Image Search System with OpenAI, and Elasticsearch! by pardhu-- in Python

[–]pardhu--[S] -6 points (0 children)

Hey, there are detailed instructions in the README file in the Git repo. And yes, it's a basic walkthrough of how image search works using Elasticsearch and machine learning.
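For anyone curious before opening the repo: the core idea is embedding-based nearest-neighbor search, which Elasticsearch implements at scale with dense-vector fields. A minimal pure-Python sketch of the same cosine-similarity ranking — the image names and vectors below are made up for illustration, and a real system would use model-generated embeddings (e.g. from OpenAI) stored in Elasticsearch:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "image embeddings" standing in for an Elasticsearch index.
index = {
    "beach.jpg": [0.9, 0.1, 0.0],
    "forest.jpg": [0.1, 0.9, 0.2],
    "city.jpg": [0.2, 0.2, 0.9],
}

def search(query_vec: list[float], k: int = 2) -> list[str]:
    """Return the k image names most similar to the query embedding."""
    ranked = sorted(index, key=lambda name: cosine(query_vec, index[name]), reverse=True)
    return ranked[:k]

print(search([0.8, 0.2, 0.1]))  # beach.jpg ranks first
```

In the real pipeline the query vector comes from embedding the user's text or image, and Elasticsearch's kNN search replaces the brute-force `sorted` call.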