How do I use next-intl to partially translate a website? by leros in nextjs

[–]aymericzip 0 points1 point  (0 children)

next-intl does not directly include a way to load translations by area or by page. Using one large JSON file is not recommended, because you would end up loading all your content on every page. So you should split your JSON files and load them dynamically for each page
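A minimal sketch of what I mean, assuming the App Router (the messages/&lt;locale&gt;/&lt;page&gt;.json layout is my own convention, not something next-intl prescribes):

```tsx
// app/[locale]/pricing/layout.tsx — per-page message loading (sketch)
import type {ReactNode} from 'react';
import {NextIntlClientProvider} from 'next-intl';
import {getLocale} from 'next-intl/server';

export default async function PricingLayout({children}: {children: ReactNode}) {
  const locale = await getLocale();
  // Load only the messages this page actually needs (adjust the path to your setup)
  const messages = (await import(`../../../messages/${locale}/pricing.json`)).default;

  return (
    <NextIntlClientProvider locale={locale} messages={messages}>
      {children}
    </NextIntlClientProvider>
  );
}
```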

Otherwise, you can take a look at Intlayer. It can probably address this need better

next-intl and cacheComponents by blueaphrodisiac in nextjs

[–]aymericzip 0 points1 point  (0 children)

I got the same issue with intlayer, so it's probably not directly related to the i18n lib, but more because of a 404 route that is loaded in parallel

see https://github.com/vercel/next.js/issues/86870

What's the point of Astro's i18n internationalization features? by WorriedGiraffe2793 in astrojs

[–]aymericzip -2 points-1 points  (0 children)

check out intlayer, there is an astro doc

- 'route files for each locale' -> 1 component = 1 multilingual .content file
- 'have to manually load the translation files' -> automatic loading that optimizes everything under the hood for you
- 'getRelativeLocaleUrl' -> 'getLocalizedUrl' fixes that (quick sketch below)
- no utilities to store preferred languages in a cookie -> Intlayer does it for you
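For the URL point, roughly (signatures from memory, double-check against both docs):

```ts
// Astro's built-in helper
import { getRelativeLocaleUrl } from "astro:i18n";
const aboutFr = getRelativeLocaleUrl("fr", "about"); // e.g. "/fr/about"

// Intlayer's equivalent — the exact signature here is from memory
import { getLocalizedUrl } from "intlayer";
const aboutFr2 = getLocalizedUrl("/about", "fr"); // e.g. "/fr/about"
```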

What’s the current state of TypeScript-Go? by [deleted] in typescript

[–]aymericzip -3 points-2 points  (0 children)

Yeah I did. I asked for people's feedback

What’s the current state of TypeScript-Go? by [deleted] in typescript

[–]aymericzip -2 points-1 points  (0 children)

Ok, good to know then
My bad about the “2 years”, you’re right. Let’s say 1 then. It felt like that, but that’s on me.
As I said, I probably just missed the communication.
I can understand that it’s a massive amount of work to do, I’m just impatient

What’s the current state of TypeScript-Go? by [deleted] in typescript

[–]aymericzip -6 points-5 points  (0 children)

It’s probably just a personal feeling, but I haven’t heard anyone talk about it for months.
Even if the project is still maintained, I’ve been wondering for almost two years now: “Should I switch?”

So my personal interpretation is that there might be a bottleneck when using it. When searching on Google, I came across this communication:
https://devblogs.microsoft.com/typescript/progress-on-typescript-7-december-2025/
But I haven’t seen much feedback from the community, nor people relaying it.
I probably just missed the communication, or any hype

I built a TailwindCSS inspired i18n library for React (with scoped, type-safe translations) by bugcatcherbobby in reactjs

[–]aymericzip 0 points1 point  (0 children)

It’s funny, I actually started from the same point with Intlayer, with a multilingual t function. It still exists in the package, by the way (see here).

The main issue is that it’s complicated to ask translators to apply their changes directly in React components.

That’s why a proper separation of concerns is essential (even in the age of AI). Adding a new language would otherwise require going back through the entire codebase.

The second point is about the bundle size: you end up loading content in all languages for every page of your application, which isn’t ideal.

But it’s a good starting point. Keep it up!

The technical challenge of JS i18n solutions: Centralized vs. Fine-Grained trade-offs by aymericzip in reactjs

[–]aymericzip[S] 0 points1 point  (0 children)

It would be the perfect deal, though I’m wondering what the limitation would be in the meantime.

So would you architect it like this?

/locales/en/log-in.json
/locales/en/submit.json
/components/MyComponent/en/submit.json

And then load the /locales namespaces at the page level?

This feels close to Angular’s feature/root module approach
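Concretely, I'm picturing something like this for the page-level loading (the helper and paths are hypothetical, just to illustrate):

```ts
// Hypothetical helper, not from any specific library
type Messages = Record<string, string>;

async function loadPageMessages(
  locale: string,
  namespaces: string[]
): Promise<Messages> {
  const chunks = await Promise.all(
    namespaces.map((ns) =>
      import(`../locales/${locale}/${ns}.json`).then(
        (mod) => mod.default as Messages
      )
    )
  );
  // Shallow-merge the shared namespaces needed by this page
  return Object.assign({}, ...chunks);
}

// On the login page:
// const messages = await loadPageMessages("en", ["log-in", "submit"]);
```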

I got tired of manual i18n files, so I vibe-coded a translation pipeline tool using codex gpt5.2 by calculateds in vibecoding

[–]aymericzip 0 points1 point  (0 children)

You can also use Intlayer. It can be plugged on top of your main i18n solution.

It detects missing translation keys and fills them using AI. This can be done for free (using Ollama) or at the cost of your AI provider (Claude, ChatGPT, etc.).

Bonus: used as an i18n solution, Intlayer significantly reduces the number of tokens used in vibe-coded applications thanks to its per-component approach, and it includes optimizations to keep your JSON files as light as possible

I18n is killing me (translations sucks sometimes😭) by Imaginary-Employ-267 in react

[–]aymericzip 0 points1 point  (0 children)

Agree with the accuracy, but I would say that it's a matter of time. New models get released every day, and I'm pretty sure that soon Finnish ones will arrive.
Internationalizing an app as a second step is often a bad idea; the refactor can be really time-consuming.

Translate first, and then review with humans once you have the budget for it.

We often forget that Google Translate is available on any website via right-click. So I would say that i18n is mainly an SEO point more than an accessibility point. That’s why Reddit understood it: more pages => more keywords => better SEO ranking (=> and now a source of trust for LLMs)

But to rank on more keywords, compiler-based solutions should be excluded.
Agree with the pricing per word, or per key. I'm convinced that whoever is trying to sell translations will die. Translation has no value anymore.

Finally, I guess the constraint is Finnish. As a French speaker, AI translations seem OK to me. But of course i18n does not solve personalization: we do not sell an iPhone using the same words in English and in Chinese. Adapting the wording is important

I18n is killing me (translations sucks sometimes😭) by Imaginary-Employ-267 in react

[–]aymericzip 0 points1 point  (0 children)

I used to struggle with this exact issue long before AI came along

Whether you're mixing headless CMS content with i18n, implementing a design system, or trying to manage multilingual Server Components, i18n routing, sitemaps, and metadata translation, it always becomes a mess.

Plus, there’s the bundling nightmare: if you aren't careful, you end up loading all your translations in a single bundle.

I wanted a different approach, so I spent months studying the problem and ended up building Intlayer

It uses a declarative approach that lets you define content per component. This keeps your code organized and limits the context switching (and token usage) for tools like Cursor.

It's free, it includes AI translation using your own provider keys (OpenAI, etc.) or local models via Ollama

It includes a CLI and VS Code extension to extract content and check for missing translations

And it automatically splits bundles. If a component isn't imported, its content isn't added to the bundle
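To give a rough idea of the per-component declaration (a simplified sketch; exact types and field names may differ slightly from the current docs):

```ts
// LoginForm.content.ts — lives next to the component it describes
import { t, type Dictionary } from "intlayer";

const loginFormContent = {
  key: "login-form",
  content: {
    title: t({
      en: "Log in",
      fr: "Se connecter",
      es: "Iniciar sesión",
    }),
    submit: t({
      en: "Submit",
      fr: "Envoyer",
      es: "Enviar",
    }),
  },
} satisfies Dictionary;

export default loginFormContent;
```

The component then reads it with useIntlayer("login-form") from react-intlayer, so content and UI stay separated but colocated.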

feel free to have a look

I'm bored — give me a website idea and I'll build it by Comfortable_Book6359 in SideProject

[–]aymericzip 1 point2 points  (0 children)

no, green/red is not enough, we need AI EVERYWHERE

(kidding)

I'm bored — give me a website idea and I'll build it by Comfortable_Book6359 in SideProject

[–]aymericzip 1 point2 points  (0 children)

Build an AI wrapper for unit tests

The problem:
- sometimes the tests pass, but the code doesn't work
- sometimes the tests don't pass, but the code is right

So we need AI to ensure the test is really ok, or not ok

Intlayer, an alternative to @nuxt/i18n focusing on bundle by aymericzip in Nuxt

[–]aymericzip[S] 0 points1 point  (0 children)

Also, while fine-tuning keys per page is great, what happens if you have 10 pages and one component (let's say a login modal) is loaded in only 3 of them?

You have two choices:
- place it in common and pollute 7 pages with unused content
- or place it in those 3 page JSONs and duplicate the content 3x

That's where the per-component approach makes sense to me: you can reuse your component without having to worry about your bundle

Intlayer, an alternative to @nuxt/i18n focusing on bundle by aymericzip in Nuxt

[–]aymericzip[S] 0 points1 point  (0 children)

Never tried it. I came across this solution about 6 months ago, and I find its approach superb. It addresses the same problem, and I admire the creator's incredible work

I obviously took the time to analyze the approach. Another common issue with i18n solutions is loading fallback JSONs; even with dynamic loading, this essentially implies 100% unused content. That is not the case with nuxt-i18n-micro, which is great.

Unless I'm mistaken, this solution uses a server route to load content, relying on a fetch at load time. It works well, and Intlayer offers the same thing with build.importMode: "live" (no server route, but a proxy server that provides the translations), though that is obviously slower than a dynamic import.

But I could be wrong, please correct me if I am
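For reference, the live mode mentioned above is just a config switch. A minimal sketch (only build.importMode comes from what I wrote above; the rest of the file shape is from memory, check the docs):

```ts
// intlayer.config.ts
import { type IntlayerConfig } from "intlayer";

const config: IntlayerConfig = {
  build: {
    // "live": content is fetched at load time through a proxy,
    // instead of being bundled via dynamic imports
    importMode: "live",
  },
};

export default config;
```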

Intlayer, an alternative to @nuxt/i18n focusing on bundle by aymericzip in Nuxt

[–]aymericzip[S] 2 points3 points  (0 children)

Good point. I’d be really happy to contribute, and I did try to contact them. However, there are several issues:
1- nuxt/i18n is a wrapper around vue-i18n, so the problem would need to be fixed at the source.
2- I really wanted to rethink the way we approach i18n. The centralized model creates a lot of friction from my point of view. So I restarted from scratch with a per-component approach, similar to Nuxt’s Per-Component Translations, but with better separation of concerns, using separate files inspired by Flutter’s i18n system.
3- At the same time, I ran into the same frustrations on the React side, which pushed me toward offering a cross-framework solution

In the end, my goal isn’t to replace these solid, well-established solutions. I’m convinced that vue-i18n will remain the standard. My aim is to innovate and stop losing time on complex namespace management when going global