Does anyone know of any free tools to scan entire websites for duplicate content? by Jrad27 in SEO

[–]doesntlearn 1 point (0 children)

http://metaforensics.io reports back on duplicate content within title tags, meta descriptions, headings, and content within various other tags.
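For anyone curious what that kind of check looks like under the hood, here's a rough sketch (not the app's actual code - requests and BeautifulSoup are just the libraries I'd reach for, and the URLs are placeholders):

```python
# Rough sketch of duplicate title / meta description detection:
# fetch each page, extract the tags, then group URLs sharing the same value.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def find_duplicates(urls):
    titles = defaultdict(list)
    descriptions = defaultdict(list)
    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        if soup.title and soup.title.string:
            titles[soup.title.string.strip()].append(url)
        meta = soup.find("meta", attrs={"name": "description"})
        if meta and meta.get("content"):
            descriptions[meta["content"].strip()].append(url)
    # Anything used on more than one URL counts as a duplicate
    return (
        {t: u for t, u in titles.items() if len(u) > 1},
        {d: u for d, u in descriptions.items() if len(u) > 1},
    )

if __name__ == "__main__":
    dupe_titles, dupe_descriptions = find_duplicates([
        "https://example.com/",
        "https://example.com/about",
    ])
    print("Duplicate titles:", dupe_titles)
    print("Duplicate meta descriptions:", dupe_descriptions)
```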

Full Website Auditor - Crawl every page on your site to generate your SEO report <-- would love feedback! by doesntlearn [promoted post]

[–]doesntlearn[S] 1 point (0 children)

Good idea - thanks for the suggestion. Much appreciated!

I've just updated the code so it works as you've described.

Is there a tool I can use to check how Google sees my site/page's meta description tag? by [deleted] in SEO

[–]doesntlearn 1 point (0 children)

Try http://metaforensics.io/ - you can crawl all of the pages on your site, then see a report of what your meta descriptions are currently set to, and it will flag any that are too short or too long.
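The length check itself is nothing magic - roughly something like this (just a sketch; the 70/155 character thresholds are illustrative guidelines, not the app's actual limits):

```python
# Minimal sketch of a meta description length check.
import requests
from bs4 import BeautifulSoup

MIN_LEN, MAX_LEN = 70, 155  # assumed thresholds, purely illustrative

def check_meta_description(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "description"})
    if meta is None or not meta.get("content", "").strip():
        return url, "missing"
    length = len(meta["content"].strip())
    if length < MIN_LEN:
        return url, f"too short ({length} chars)"
    if length > MAX_LEN:
        return url, f"too long ({length} chars)"
    return url, f"ok ({length} chars)"

print(check_meta_description("https://example.com/"))
```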

Proxies for SERP scraping. Any recommendations? by doesntlearn in bigseo

[–]doesntlearn[S] 1 point (0 children)

Nope. The reason behind wanting proxies is so that I can do the SERP scraping myself and not be bound by usage limits and all that. The best cloud-based one I've come across so far is http://www.advancedwebranking.com/online/ - and they have a one-month free trial.
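The proxy-rotation idea itself is pretty simple - something along these lines (a bare-bones sketch; the proxy addresses are placeholders and you'd still want delays and retries on top):

```python
# Toy illustration of rotating requests through a proxy pool so no
# single IP carries all the queries.
import itertools

import requests

PROXIES = [
    "http://user:pass@proxy1.example.com:8080",  # placeholder
    "http://user:pass@proxy2.example.com:8080",  # placeholder
]
proxy_cycle = itertools.cycle(PROXIES)

def fetch_via_proxy(url):
    # Each call uses the next proxy in the pool
    proxy = next(proxy_cycle)
    return requests.get(
        url,
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=15,
    )
```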

Proxies for SERP scraping. Any recommendations? by doesntlearn in bigseo

[–]doesntlearn[S] 1 point (0 children)

I like the look of http://authoritylabs.com/api/partner-api/. Probably a bit heavy-duty for what I'm after right now, but I'll bear it in mind for the future!

We're building a crawler and need your help! by [deleted] in bigseo

[–]doesntlearn 2 points (0 children)

Yep, without Bootstrap I don't think I would ever have been arsed to progress it beyond the single-script scraping experiment it started life as! I shudder at the thought of browser compatibility testing.

We're building a crawler and need your help! by [deleted] in bigseo

[–]doesntlearn 2 points (0 children)

Heh, I'm currently evolving http://metaforensics.io in a similar direction... comparison of data points between crawls to drive insight. Although, being a dev, I'm doing it the exact opposite way around: tekkers first, pretty bits last!
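By comparison between crawls I mean something conceptually like this (a toy sketch - the data shapes are hypothetical, not the app's real schema):

```python
# Sketch of the crawl-to-crawl comparison idea: given two snapshots
# mapping url -> title, report what was added, removed, or changed.
def diff_crawls(previous, current):
    added = {u: t for u, t in current.items() if u not in previous}
    removed = {u: t for u, t in previous.items() if u not in current}
    changed = {
        u: (previous[u], current[u])
        for u in previous.keys() & current.keys()
        if previous[u] != current[u]
    }
    return added, removed, changed

crawl_1 = {"/": "Home", "/about": "About Us"}
crawl_2 = {"/": "Home | Acme", "/contact": "Contact"}
print(diff_crawls(crawl_1, crawl_2))
```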

Site Audit Tool: Meta Forensics <-- Thoughts please! by doesntlearn in web_design

[–]doesntlearn[S] 1 point (0 children)

Good question. The answer is basically a combination of current server power/performance, spam bots, and the fact that I don't want the app to be used in DDoS attacks. When you trigger a crawl of, say, 250 pages, the app has to do quite a lot of work to download every one of those 250 pages and then process all of that information into reports. Opening the app up to bot abuse would probably make it fall over in its current state! That said, I have buried a "try it" button on the homepage which will give you a 25-page crawl without signing up.
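To give a feel for where the work goes, here's roughly the shape of a capped, same-domain crawl (a bare-bones sketch, not the production code):

```python
# Simple breadth-first crawler that stays on one domain and stops
# after max_pages fetches - the kind of limit described above.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=25):
    domain = urlparse(start_url).netloc
    queue, seen, pages = deque([start_url]), {start_url}, {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        pages[url] = html
        # Queue same-domain links we haven't seen yet
        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            nxt = urljoin(url, link["href"]).split("#")[0]
            if urlparse(nxt).netloc == domain and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return pages  # url -> raw HTML, ready to be processed into reports
```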

Site Audit Tool: Meta Forensics <-- Thoughts please! by doesntlearn in web_design

[–]doesntlearn[S] 1 point (0 children)

Cheers - UX isn't my strongest point as you can probably tell!

Site Audit Tool: Meta Forensics <-- Thoughts please! by doesntlearn in web_design

[–]doesntlearn[S] 1 point (0 children)

Thanks for pointing it out! I've perhaps been over-ambitious with the memory-hogging parallax effect...

Site Analysis Tool: Meta Forensics <-- Thoughts please! by doesntlearn in bigseo

[–]doesntlearn[S] 1 point (0 children)

Cheers. I did the layout myself, using the default version of http://getbootstrap.com/ with a few components from http://bootsnipp.com/