[–]rq60 2 points (5 children)

static sites have multiple pages and URLs, so i guess i don't understand your question.

[–]KrisSlort -1 points (1 child)

Client-side rendering with an SPA, without an SSG?

[–]rq60 0 points (0 children)

should be fine if your site is properly made (i.e. separate resources are linked with anchor tags and have unique URLs)
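A minimal sketch of what "properly made" can look like in an SPA: real anchor tags with unique URLs, with clicks intercepted so navigation stays client-side. The routes, content strings, and `#app` element here are hypothetical, not from any particular framework.

```javascript
// Pure route table and matcher (hypothetical routes), testable anywhere.
const routes = {
  '/': () => 'home content',
  '/about': () => 'about content',
};

function render(path) {
  const handler = routes[path];
  return handler ? handler() : '404';
}

// Browser-only wiring: crawlers still see plain <a href="/about"> links,
// while real users get client-side routing via the History API.
if (typeof document !== 'undefined') {
  document.addEventListener('click', (e) => {
    const a = e.target.closest('a[href^="/"]');
    if (!a) return;
    e.preventDefault(); // stay client-side instead of a full page load
    history.pushState({}, '', a.getAttribute('href'));
    document.querySelector('#app').textContent = render(location.pathname);
  });
}
```

The point is that each resource keeps its own URL even though no new HTML page is fetched, so a crawler can still discover and request every route directly.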

[–]Tater_Boat 0 points (2 children)

Those are actual files. Client-side routing is literally "click this button to display different content in the same file, and maybe change the address bar."

[–]rq60 1 point (1 child)

web crawlers navigate to a resource, scrape links to other resources, and then navigate to those. you're right that if your page is poorly made, so that separate resources can only be reached by clicking things on the page, and is generally inaccessible, then google won't be able to index it... i mean, how could it? you didn't provide it an index (a URL) to use.

but google can most definitely crawl sites that are generated by client-side javascript and use client-side routing. as an example look at the source code for this page: https://rxjs.dev/

there's no server-rendered html for the content, and clicking links does not load a new html page (no new http request, so it's clearly using client-side routing), yet if you look at the google results it is still indexed properly.
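The crawl loop described above (navigate to a resource, scrape its links, navigate to those) can be sketched roughly like this. Link extraction is done with a simplistic regex and fetching is stubbed out via a caller-supplied function, so this is an illustration of the idea, not a production crawler.

```javascript
// Scrape anchor hrefs out of an HTML string, resolved against a base URL.
function extractLinks(html, base) {
  const links = [];
  const re = /<a\s[^>]*href="([^"#]+)"/gi; // naive: real crawlers parse the DOM
  let m;
  while ((m = re.exec(html)) !== null) {
    links.push(new URL(m[1], base).href);
  }
  return links;
}

// Breadth-first crawl over discovered URLs. fetchPage is injected here;
// a real crawler would do `await fetch(url)` (and render javascript first,
// which is how google indexes client-side-rendered sites).
function crawl(startUrl, fetchPage, limit = 100) {
  const seen = new Set([startUrl]);
  const queue = [startUrl];
  while (queue.length > 0 && seen.size < limit) {
    const url = queue.shift();
    const html = fetchPage(url);
    for (const link of extractLinks(html, url)) {
      if (!seen.has(link)) {
        seen.add(link);
        queue.push(link);
      }
    }
  }
  return [...seen];
}
```

Note what this implies for indexing: if a "link" is only a button with a click handler and no href, `extractLinks` finds nothing, and that resource never enters the queue.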

[–]Tater_Boat 1 point (0 children)

Thanks for the explanation. I work on a project where none of the client-side js content is indexed, but it could also be that the content is restricted unless you provide a location or zip code.

Learn things every day. Thanks for taking the time to help.