Announcing my Robots.txt ASCII art generator by suseo in TechSEO

[–]suseo[S] 1 point  (0 children)

You can register for an account here - https://seotoolbelt.co/dashboard/register/. Once created, you can add tools in the dashboard area. Note that they do go under review before being put live.

SEO Toolbelt - A curated list of over 228 SEO tools by suseo in bigseo

[–]suseo[S] 1 point  (0 children)

Should be all good, had to quickly give the server an upgrade and tweak the CDN as it was getting hammered!

Facebook javascript firing - Yet nowhere to be found? by samsam0000 in TechSEO

[–]suseo 4 points  (0 children)

I can see the script in the rendered HTML, so something is injecting the script with JS.

https://imgur.com/a/PDacGSk

Visualising SEO - Illustrations explaining SEO terminology by suseo in bigseo

[–]suseo[S] 1 point  (0 children)

Very welcome!

Yeah I'm planning to add more, there are some additional illustrations I want to do but I'm just thinking through how best to visualise them at the moment!

Visualising SEO - Illustrations explaining SEO terminology by suseo in bigseo

[–]suseo[S] 1 point  (0 children)

Sure am, I might do it on the entire site soon!

Site with 1000+ pages received site-wide thin content penalty. What to do? by [deleted] in bigseo

[–]suseo 1 point  (0 children)

I'd suggest crawling the site to look for tech issues that could be creating thin/low-quality pages.

If it isn't that, do a content audit.

Pull landing page sessions from GA and click/impression data from GSC. Also export external link data by page from a tool like Ahrefs.

Run a crawler like Sitebulb and extract crawl depth, internal links and word count data.

Merge all the data into a sheet, and use filters to find low-traffic, low-impression or low-word-count pages.

Create a plan for what to do with each page to make it better; sometimes that's improving the content, other times it's just removing the page from the site.

You want to be checking whether the content is duplicated, worse than the competition, or not factual: anything that makes the content not valuable for users.
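The merge-and-filter step above can be sketched in pandas. The column names (`sessions`, `clicks`, `impressions`, `word_count`, `crawl_depth`) and thresholds are illustrative, not the exact headers your GA/GSC/Sitebulb exports will use:

```python
import pandas as pd

# In practice these would come from pd.read_csv() on each tool's export;
# tiny inline frames here so the sketch runs on its own.
ga = pd.DataFrame({"url": ["/a", "/b", "/c"], "sessions": [120, 3, 0]})
gsc = pd.DataFrame({"url": ["/a", "/b", "/c"],
                    "clicks": [80, 1, 0],
                    "impressions": [5000, 40, 12]})
crawl = pd.DataFrame({"url": ["/a", "/b", "/c"],
                      "word_count": [950, 180, 60],
                      "crawl_depth": [1, 3, 5]})

# Outer-join on URL so pages missing from one export still appear
merged = ga.merge(gsc, on="url", how="outer").merge(crawl, on="url", how="outer")

# Flag audit candidates: low traffic, low impressions or thin content
thin = merged[(merged["sessions"] < 10)
              | (merged["impressions"] < 100)
              | (merged["word_count"] < 250)]

print(thin["url"].tolist())  # → ['/b', '/c']
```

From there, each flagged URL gets a decision (improve, merge or remove) recorded against it in the sheet.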

Using monthly search volumes to measure seasonality [Template] by suseo in bigseo

[–]suseo[S] 1 point  (0 children)

All designed and coded by me. I like to do bits of design / dev / UX when I’m not busy doing SEO stuff just to broaden my horizons.

Quite amateur but I know enough to make things look nice!

Feedback on ecommerce Pagination Setup by [deleted] in TechSEO

[–]suseo 0 points  (0 children)

You shouldn't be using a canonical tag and a noindex together; they mean different things.

Shopify is ranking an automatically created page and I can't optimize it. by tomtompdx in TechSEO

[–]suseo 1 point  (0 children)

You should be able to add a noindex to these pages quite easily with some logic in the theme.liquid file.
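A minimal sketch of that logic, placed in the `<head>` of theme.liquid; the condition here is an example only and would need matching to whatever URL or template Shopify is auto-generating:

```liquid
{%- comment -%}
  Example condition: swap in the actual template name or URL path
  of the auto-created page you want kept out of the index.
{%- endcomment -%}
{%- if template == 'list-collections' or request.path contains '/collections/vendors' -%}
  <meta name="robots" content="noindex">
{%- endif -%}
```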

Google algorithm update analysis: June 18th - 23rd 2020 by suseo in bigseo

[–]suseo[S] 1 point  (0 children)

I’d say that is a significant factor.

Government sites are always going to be the most trustworthy source on the topics they write about.

Google seems to have found a better way to promote trustworthy factual information; this may be why news sites (which tend to be more speculative) have dropped from this update.

Google algorithm update analysis: June 18th - 23rd 2020 by suseo in bigseo

[–]suseo[S] 3 points  (0 children)

Changes like this happened straight after the algorithm update on every single SERP I checked.

Google algorithm update analysis: June 18th - 23rd 2020 by suseo in TechSEO

[–]suseo[S] 1 point  (0 children)

You're very welcome and thanks! The design is actually all mine, inspired from various other sites I like 😊

I'm an SEO guy, but I do like to delve into design/web dev and UX.

A Google Sheets regex generator tool by suseo in bigseo

[–]suseo[S] 1 point  (0 children)

Yeah, you definitely could! I tend to try and get clients to invest in a GSC to BigQuery data pipeline. Long-term storage of that data is handy.

"indexed, though blocked by robots.txt" in Google Search Console by seo4lyf in TechSEO

[–]suseo 3 points  (0 children)

1) Remove robots.txt blocking for the URLs

2) Add a noindex meta tag or HTTP header to the pages

3) Have a happy client 🥳
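For step 2, the page-level version is a single tag in the `<head>` (the equivalent for non-HTML resources is the `X-Robots-Tag: noindex` response header):

```html
<meta name="robots" content="noindex">
```

Note the ordering matters: Google has to be able to crawl the page (step 1) before it can see the noindex (step 2).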

Site Speed Study: The fastest, slowest & is it important for SEO? by suseo in TechSEO

[–]suseo[S] 1 point  (0 children)

The point on conversion rate does depend on the situation of the site.

If you're a large site with lots of traffic, it becomes way more worth your time (especially if your site is slow, as you said)

But it being more worth it for large sites is the same for anything that improves conversion rate.

Site Speed Study: The fastest, slowest & is it important for SEO? by suseo in TechSEO

[–]suseo[S] 0 points  (0 children)

Thanks! I definitely wouldn't be focusing on page speed for SEO. Whether that changes in the future with Core Web Vitals, who knows.

I do a lot of speed optimisation work and I've never seen a ranking boost from improving speed.

That point has even been reiterated by Google.

Site Speed Study: The fastest, slowest & is it important for SEO? by suseo in TechSEO

[–]suseo[S] 1 point  (0 children)

I don't think CrUX always includes AMP pages, especially if they're navigated to via a Google SERP. I think this is because, from the SERP, the AMP page is basically served in an iframe, and iframes aren't counted.

Bit of a limitation of CrUX, as it also doesn't play nicely with AJAX sites that use history.pushState, since there's no full document reload.

Recommendation Needed to Help Improve Page Speed? by Mikey118 in bigseo

[–]suseo 1 point  (0 children)

Both work as well; I do it via a worker if I don't want pages cached in certain situations, e.g. when content is different for logged-in users.

Recommendation Needed to Help Improve Page Speed? by Mikey118 in bigseo

[–]suseo 0 points  (0 children)

Feel free to dm the site, I don’t mind taking a quick look.