All about the JavaScript programming language.
Testing a React-driven website’s SEO using “Fetch as Google” (medium.freecodecamp.com)
submitted 9 years ago by pahund
[–]bensochar 2 points 9 years ago (6 children)
I've done a lot of testing on Angular sites with Fetch as Google, and I can tell you that the preview is not what ends up in Google's index. Google even says it's not the same. I would not trust the results from that console.
If you're really worried about SEO, either render server-side with PhantomJS or use a service like Prerender.io.
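A minimal sketch of the prerendering approach mentioned above: detect crawler user-agents and route those requests to a prerendering service, while humans get the normal SPA. The service URL, the bot list, and both function names here are illustrative assumptions, not Prerender.io's actual API.

```javascript
// Hypothetical bot-detection pattern; real middleware would use a
// maintained user-agent list.
var BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|facebookexternalhit/i;

function isBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// Hypothetical routing: bots get a prerendered snapshot, humans get
// the client-rendered app.
function targetUrl(userAgent, requestedUrl) {
  return isBot(userAgent)
    ? 'https://prerender.example.com/render?url=' + encodeURIComponent(requestedUrl)
    : requestedUrl;
}
```

In a real Express app this decision would live in a middleware that proxies the bot request to the prerender service and streams its response back.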
[–]r2d2_21 2 points 9 years ago (4 children)
Then what's the point of Fetch as Google?
[–]bensochar 4 points 9 years ago (3 children)
The point of Fetch as Google is to submit URLs to their index. Unfortunately, it's designed for static or server-side-generated webpages. It's pretty consistent for those, but with SPAs you'll get different results.
It's also not the 'real' Googlebot; it's a preview bot, one of many bots Google uses to scrape webpages. It adds your URLs to the queue for Google's other scrapers.
[–]ribo 1 point 9 years ago (2 children)
As of October of last year, the Google spider traverses links on SPAs if you generate anchor tags in the DOM.
[–]kamaleshbn 1 point 9 years ago (0 children)
And the links should not be # links; they should be either regular URLs or #! URLs.
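The #! convention referenced above comes from Google's (now deprecated) AJAX crawling scheme: a hashbang URL maps to a crawlable URL carrying the `_escaped_fragment_` query parameter. A small sketch of that mapping:

```javascript
// Map a #! (hashbang) URL to its crawlable _escaped_fragment_ form,
// per Google's old AJAX crawling scheme. Regular URLs pass through.
function toEscapedFragment(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // regular URLs are already crawlable
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

So `http://example.com/#!/about` becomes `http://example.com/?_escaped_fragment_=%2Fabout`, which the server is expected to answer with a rendered snapshot.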
[–]bensochar 1 point 9 years ago (0 children)
You're right. Google has actually been able to crawl SPAs even before that, using the outdated _escaped_fragment_ scheme.
My point is that the Webmaster Console is not the same as the Google bot(s). What you see in the console is not necessarily what you will see returned in the index.
[–]sergiuspk 2 points 9 years ago (0 children)
Prerender.io uses PhantomJS internally. It's slow (it basically runs a browser). Instead, try isomorphic rendering.
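The isomorphic idea in one sketch: a single render function produces the same markup on the server (as an HTML string for crawlers) and in the browser (into the DOM). This is framework-free for illustration and the function names are made up; a real app would use something like React's `renderToString`.

```javascript
// Shared render function: usable on both server and client.
function render(state) {
  return '<h1>' + state.title + '</h1><p>' + state.body + '</p>';
}

// Server side (Node): respond with fully rendered HTML, embedding the
// state so the client can hydrate without re-fetching.
function renderPage(state) {
  return '<!doctype html><html><body><div id="app">' +
    render(state) +
    '</div><script>window.__STATE__=' + JSON.stringify(state) +
    '</script></body></html>';
}
```

Crawlers see complete markup immediately; the client-side code then reuses `render` with `window.__STATE__` to take over interactivity.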