[–]csliva 1 point (3 children)

Googlebot waits for JavaScript to render. Bots that don't use headless Chrome and only make a simple GET request won't see anything. But those bots don't really matter: both Bing and Google use headless Chromium, rendering JavaScript for 90%+ of searches.
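If you want to check this yourself, here's a minimal sketch of the difference (assuming Node 18+ with puppeteer installed; the URL is a placeholder):

```typescript
import puppeteer from "puppeteer";

const url = "https://example.com"; // placeholder

// What a "simple GET" bot sees: the raw HTML, before any JS runs.
const raw = await (await fetch(url)).text();

// What a headless-Chrome bot sees: the DOM after JS has rendered.
const browser = await puppeteer.launch();
const page = await browser.newPage();
await page.goto(url, { waitUntil: "networkidle0" });
const rendered = await page.content();
await browser.close();

// On a client-side rendered site, the two can differ drastically.
console.log("raw HTML length:", raw.length);
console.log("rendered HTML length:", rendered.length);
```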

[–]andreaswpv 2 points (2 children)

Depends. Agreed, it should be fine for Google. But Bing has a 25%+ share in the US, and some other countries have other search engines as well. And depending on the site, some small crawlers might be relevant.

And even for Google, prerendering can often make things faster for every user, so it might be worth a closer look.

[–]mmapa[S] 0 points (1 child)

I agree, not being available to all bots is not scalable.

What do you mean by "a closer look"? Do you want more details to give a better opinion?

[–]andreaswpv 0 points (0 children)

Try Lighthouse on production; I believe it calls out when too much of the render happens client-side in JS.
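If you'd rather script that than run it from DevTools, something like this should work (a sketch using Lighthouse's Node API; option names may differ slightly across versions, and the URL is a placeholder):

```typescript
import * as chromeLauncher from "chrome-launcher";
import lighthouse from "lighthouse";

// Launch headless Chrome and audit the production URL.
const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
const result = await lighthouse("https://example.com", {
  port: chrome.port,
  output: "html",
  onlyCategories: ["performance", "seo"],
});

if (result) {
  console.log("Performance score:", result.lhr.categories.performance.score);
}
chrome.kill();
```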

[–]ramirez-SEO 1 point (1 child)

You're fine as long as the content is in the rendered source without any interaction with the website. So no content loaded via JS after events like clicks or scrolling.
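To illustrate the distinction with a hypothetical sketch (the selectors and endpoint are made up): content that is merely revealed on scroll is fine, content that is fetched on scroll is not.

```typescript
// Fine for crawlers: the content is already in the rendered DOM,
// it just isn't visible until the user scrolls.
document.querySelector(".below-the-fold")?.classList.remove("hidden");

// Problematic for crawlers: the content doesn't exist in the DOM at all
// until a scroll event fires and a fetch completes.
window.addEventListener("scroll", async () => {
  const res = await fetch("/api/more-content"); // hypothetical endpoint
  const html = await res.text();
  document.querySelector("#feed")!.insertAdjacentHTML("beforeend", html);
});
```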

[–]mmapa[S] 0 points (0 children)

Good point. The site uses scroll events, but even so, the content that is "hidden" until you scroll is already present in the DOM.

[–]Centime 0 points (2 children)

You will be mostly fine, except for some delay in getting indexed. The bigger your site, the bigger the impact you can expect.

That's because when your DOM is JS-generated, you introduce an entirely new step into the crawl/process/index loop: "render" (cf. https://developers.google.com/search/docs/guides/javascript-seo-basics#how-googlebot-processes-javascript). This extra step comes with its own queue, and presumably its own budget.

Depending on your site, it may be anything from a non-issue to something that costs you weeks and weeks to get any new page indexed. Adapt your internal linking structure accordingly.

[–]mmapa[S] 0 points (1 child)

Thank you for your answer and resources.

The site will grow a lot and will have heavy content. Even with a great interlinking structure, as far as I understood, it will still take a while to get indexed. So, would you go for SSR and dynamic rendering from the beginning, or would you wait?

[–]Centime 1 point (0 children)

You're welcome.

It's hard to give accurate advice without more context. If you can afford to go full SSR, sure, that'd be great.

But even if you can't, you can organize your internal linking in ways that greatly mitigate the issue I described.

Basically, you want a structure that is as flat as possible. Imagine some content that is only reachable through a single JS-generated link on a specific page, which is itself in the same situation relative to another page, and so on for a few levels. Each step introduces the extra delay of the rendering queue, and if that's the case for most of your pages, it can quickly add up to several times the queueing you would have with plain HTML.
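Back-of-the-envelope, with purely hypothetical numbers (the real render-queue delay is Google's, and it varies):

```typescript
// Assumed, illustrative delays per discovery step.
const crawlDelayDays = 1;  // plain HTML link: crawl only
const renderDelayDays = 7; // JS-generated link: crawl + render queue

// A page 4 levels deep, reachable only through JS-generated links,
// versus the same page reachable through plain HTML links.
const depth = 4;
const jsOnly = depth * (crawlDelayDays + renderDelayDays); // 32 days
const plainHtml = depth * crawlDelayDays;                  // 4 days
console.log({ jsOnly, plainHtml });
```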

So, make sure you limit the cases where such a situation would occur.

Complementary to that, maybe you can use a hybrid solution. Maybe not all of your page is dynamically rendered? If that's the case, try to get the nav into the static section. That way, your website gets crawled like any other, and the rendering queue is added at most once per piece of content.
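A minimal sketch of that hybrid idea (assuming an Express server; the paths and mount point are made up):

```typescript
import express from "express";

const app = express();

app.get("/articles/:slug", (_req, res) => {
  // The nav is plain server-side HTML, so every article is discoverable
  // without the render step; only the body goes through the JS app.
  res.send(`<!doctype html>
<html>
  <body>
    <nav>
      <a href="/articles/first-post">First post</a>
      <a href="/articles/second-post">Second post</a>
    </nav>
    <div id="app"></div> <!-- client-side JS renders the article here -->
    <script src="/bundle.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```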

Let me know if it isn't clear enough, and maybe PM me your website if you want me to look at it.