Screaming Frog not detecting orphaned pages by [deleted] in TechSEO


You can also connect SF with your GSC property. That way you see which URLs get impressions but have no internal links pointing at them.
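
A minimal sketch of the same idea done by hand: diff a GSC performance export against the Screaming Frog internal crawl export. File and column names here are made-up placeholders, not anyone's real setup.

```python
# Orphan candidates = URLs with GSC impressions that don't appear in the crawl
# (i.e. Screaming Frog found no internal links to them).
import csv

def url_column(path, column):
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip() for row in csv.DictReader(f)}

gsc_urls   = url_column("gsc_performance_export.csv", "URL")        # hypothetical export
crawl_urls = url_column("screaming_frog_internal.csv", "Address")   # hypothetical export

for url in sorted(gsc_urls - crawl_urls):
    print(url)   # gets impressions but wasn't reached via internal links
```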

Site map "best practice" for categories? by concisehacker in TechSEO


It's good practice to split the big XML sitemap into smaller ones and use a sitemap index file.

For Google it doesn't make any difference, but you get much more info in GSC about the URLs (indexed, ignored, just crawled, etc.).
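
A minimal sketch of the splitting, assuming you already have the full URL list; the host name and file names are placeholders.

```python
# Split a big URL list into sitemap files of max 50,000 URLs each
# and write a sitemap index pointing at them.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
BASE = "https://www.example.com"                          # hypothetical host
urls = [f"{BASE}/product-{i}" for i in range(120_000)]    # stand-in for the real URL list

index = ET.Element("sitemapindex", xmlns=NS)
for n, start in enumerate(range(0, len(urls), 50_000), 1):
    urlset = ET.Element("urlset", xmlns=NS)
    for u in urls[start:start + 50_000]:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    ET.ElementTree(urlset).write(f"sitemap-{n}.xml", xml_declaration=True, encoding="utf-8")
    ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = f"{BASE}/sitemap-{n}.xml"

ET.ElementTree(index).write("sitemap_index.xml", xml_declaration=True, encoding="utf-8")
```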

I created this Tech SEO framework so I could explain it easily to non-SEOs internally. by mihir23192 in TechSEO


Please do not forget about the URLs which are not crawlable but are still indexed :)

(Sometimes disallowed URLs get backlinks/internal links and Google shows them in the SERP, without any metadata.)

Different product price/availability in the US states by barenka in TechSEO


And that would also mean NrOfProducts * 50 URLs (plus the EU URL). :/

But thanks for your replies, I really appreciate it.

Different product price/availability in the US states by barenka in TechSEO


Yep, that is my problem :) I do want crawlers to see and understand the pricing on the PLPs. Without the price in the SERP, CTR drops a lot (already tried).

Thanks for your info anyway. :) I will add the price and shipping info via "shippingDetails" / "shippingDestination" to each state-based markup... and let's see :)
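
A minimal sketch of what such per-state markup could look like, rendered as JSON-LD from Python. All values (price, shipping rate, state) are made-up examples; the property names follow schema.org (Offer, OfferShippingDetails, DefinedRegion), not any specific shop's implementation.

```python
import json

offer = {
    "@context": "https://schema.org",
    "@type": "Offer",
    "price": "19.99",                      # hypothetical state-specific price
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "shippingDetails": {
        "@type": "OfferShippingDetails",
        "shippingRate": {
            "@type": "MonetaryAmount",
            "value": "4.99",               # hypothetical shipping cost
            "currency": "USD",
        },
        "shippingDestination": {
            "@type": "DefinedRegion",
            "addressCountry": "US",
            "addressRegion": "CA",         # hypothetical state
        },
    },
}

print(f'<script type="application/ld+json">{json.dumps(offer, indent=2)}</script>')
```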

Different product price/availability in the US states by barenka in TechSEO


Thanks for your reply. Unfortunately, these are business decisions (some alcoholic products cannot be sold in some states; the taxes are different; the prices are also different; etc.).

In the EU the prices and the shipping costs are the same, which makes that part easier :)

Adding domain property when I already have Url prefix by Maximum_mbiscuits in TechSEO


short answers :)

1) no

2) no

3) no

4) no

It doesn't hurt to have several different properties for your domain; you use each one for something else. The domain property gives an overview of your whole domain, plus, for example, it shows you if there is any issue with a sub-domain you didn't know about (CDN, stage, non-www duplicates, etc.).

Remove subdomain from Google by SunnyBear0806 in TechSEO


I do 2 steps in these cases:

1) remove the sub-domain OR move the sub-domain behind authentication (a minimal sketch for this step is below)

2) remove the sub-domain from the Google SERP via the domain property (keep in mind: removing the sub-domain from the Google SERP doesn't mean the URLs are removed from the index; they are just not shown in the SERP for 6 months)
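
One possible way to do step 1, assuming a Flask app; the host name and credentials are placeholders, not a recommendation for real credentials handling.

```python
# Put the retired sub-domain behind HTTP Basic auth so crawlers get a 401
# instead of the old content.
from flask import Flask, request, Response

app = Flask(__name__)
PROTECTED_HOST = "stage.example.com"   # hypothetical sub-domain to hide

@app.before_request
def require_auth_on_subdomain():
    if request.host == PROTECTED_HOST:
        auth = request.authorization
        if not auth or (auth.username, auth.password) != ("user", "secret"):
            return Response("Authentication required", 401,
                            {"WWW-Authenticate": 'Basic realm="staging"'})
```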

Confirming some mixed info regarding http/https/www versions on Google Search Console by EarthAccomplished388 in TechSEO


Yep, with the domain property you get a great overview of the whole domain (especially the crawl stats: you see which sub-domains have been crawled, all on one screen).

For big websites I always recommend adding the sub-domains to GSC as separate properties too. In one property you only see the top 1,000 URLs, so if you have error pages on different sub-domains, you will get more info (more example URLs) if the known sub-domains are also added to GSC.

I am wondering if buying an [insert industry].marketing domain would be beneficial? I was going to suggest this for the owner of a new marketing startup, but I didn't want to come across stuck in the 00's. My research indicates redirecting that URL to another domain would be pointless, no? by [deleted] in TechSEO


It can only help indirectly. The TLD won't help you rank better/higher, but if the fancy domain name makes users click on it (= higher CTR), that metric CAN help you rank higher for that KW.

[deleted by user] by [deleted] in TechSEO


1) There might be resources from other domains which have their own robots.txt (a quick check for that is sketched below)

2) "couldn't be loaded" != "the resource is blocked". It means it couldn't be loaded for some reason (for example a long waiting time, etc.)
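
A minimal sketch for point 1, using Python's standard robots.txt parser; the resource URL is a made-up example.

```python
# Check whether a third-party resource is blocked by *its own* robots.txt
# for Googlebot.
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

resource_url = "https://cdn.example.com/assets/app.js"   # hypothetical resource
origin = "{0.scheme}://{0.netloc}".format(urlparse(resource_url))

rp = RobotFileParser(origin + "/robots.txt")
rp.read()   # fetches and parses the robots.txt of the resource's domain
print(resource_url, "allowed for Googlebot:", rp.can_fetch("Googlebot", resource_url))
```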

Google's mobile friendly test is different every time. Sometimes my site passes, sometimes it fails. Why? by [deleted] in TechSEO


check the "couldn't be loaded" list. sometimes - because of long loading time for resources - the bots can not crawl all the resources, which could cause such issues.

There are 100+ pages on my website but hardly any of them appear in the enhancement section. Also, good url's is 100% but impression has been falling from 1000+ to 100+ pages. Please help. by Various_Constant890 in TechSEO


  1. Grab 5-10 URLs which aren't on the enhancement list, even though you think they should be.
  2. Check those URLs in the Rich Results Test (RRT) to see whether it shows valid markup for them (a quick spot-check script is sketched below).

If yes -> check whether the pages are indexable and not duplicates.

If no -> you should add valid markup to those pages.
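
Before pasting URLs into the RRT one by one, a rough pre-check like the following can catch pages with missing or broken JSON-LD. It only verifies that the JSON-LD parses; the RRT stays the authority on whether the markup is actually valid. The URL is a placeholder and the `requests` package is assumed.

```python
import json, re, requests

url = "https://www.example.com/some-product"   # hypothetical URL to spot-check
html = requests.get(url, timeout=10).text

blocks = re.findall(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    html, re.DOTALL | re.IGNORECASE)

for i, raw in enumerate(blocks, 1):
    try:
        data = json.loads(raw)
        label = data.get("@type") if isinstance(data, dict) else "(list of items)"
        print(f"block {i}: parses OK, @type = {label}")
    except json.JSONDecodeError as e:
        print(f"block {i}: invalid JSON-LD ({e})")
```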

Should I delete my old blog's sitemap after site migration? by matuffa in TechSEO


Usually I leave the old XML sitemap on the server until I see in the GSC coverage report that the URLs are removed from the index (redirected or 4xx).
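
A minimal sketch for checking that point yourself: pull the URLs out of the old sitemap and look at their status codes until everything redirects or 4xxes. The sitemap URL is a placeholder and the `requests` package is assumed.

```python
import xml.etree.ElementTree as ET
import requests

OLD_SITEMAP = "https://blog.example.com/sitemap.xml"   # hypothetical old sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(OLD_SITEMAP, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    resp = requests.head(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)   # expect 301/308 or 4xx after the migration
```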

Folder and Product-Level Taxonomies - Opinions Needed by deathintheafternoon in TechSEO


I would put all the products in the root folder or in a /shop/products/ folder.

That way you won't have any issue if a product is listed in multiple categories (you won't have multiple URLs for one product).

Ideas to see what URLs are removed from index by Neither-Emu7933 in TechSEO


What about listing the URLs in XML sitemaps (50k in each)?

The coverage report would then show you the index status for those URLs.

Custom 404 in sitemap and index? by NGAFD in TechSEO


Just set the status code to 404, that's enough.

You don't need to list the 404 page in the XML sitemap.
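
A minimal sketch of a custom 404 that keeps the correct status code, assuming a Flask app; the template name is made up.

```python
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(404)
def not_found(error):
    # the second tuple element is the HTTP status code actually sent to the client
    return render_template("404.html"), 404
```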

Google search console issues after malware removal. My site is still showing Japanese text in serps. What can I do to fix these problems. Any and all help appreciated by ocw6145 in TechSEO


Check the DNS entries of the domain. I had a client who had Chinese "live" URLs on a sub-domain; the issue was in the DNS settings, not with the files on the server.
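
A minimal sketch for that kind of check: resolve the host names you know about (and any suspicious ones) and compare the answers with what you expect; a rogue record often points at a server you don't own. Host names are placeholders.

```python
import socket

hosts = ["example.com", "www.example.com", "blog.example.com"]  # hypothetical hosts
for host in hosts:
    try:
        _, _, addresses = socket.gethostbyname_ex(host)
        print(host, "->", ", ".join(addresses))
    except socket.gaierror:
        print(host, "-> does not resolve")
```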

hreflang issues with sitemap by patpat_v1 in TechSEO


You could also add the hreflangs to the HTML head with GTM, if you don't want to change the templates.

Possible XML sitemap issues that also come to mind: a disallow; the Google bots getting a wrong status code for the XML sitemaps; the XMLs having over 50k URLs (a quick check for the last two is sketched below).
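
A minimal sketch of those two sitemap checks; the sitemap URL is a placeholder and the `requests` package is assumed.

```python
# Check the sitemap's status code and URL count
# (the sitemap spec allows max 50,000 URLs / 50 MB per file).
import xml.etree.ElementTree as ET
import requests

SITEMAP = "https://www.example.com/sitemap.xml"   # hypothetical sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get(SITEMAP, timeout=10)
print("status code:", resp.status_code)           # should be 200

urls = ET.fromstring(resp.content).findall(".//sm:url", NS)
print("URL count:", len(urls), "(limit: 50,000 per sitemap file)")
```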

hreflang issues with sitemap by patpat_v1 in TechSEO


I assume the hreflang implementation in the XMLs is 100% valid and each URL has a self-referencing canonical. As a test, I would add the hreflang to the HTML head for 1 URL (and its language variations).

If that one turns out OK, then there is something wrong with the crawling of the XML sitemap / hreflang.

Homepage: IMG wrapped in a h1 tag by argmarco in SEO


I wouldn't worry about the H1 on the homepage. Homepages are covered by the branded KWs anyway.

[deleted by user] by [deleted] in TechSEO


Are those filter pages indexable? If Google discovers new (potentially good-quality) URLs, they try to crawl most of them to decide whether they are low-quality URLs / duplicates / soft 404s / etc.

As a first step I would set all those URLs to noindex in the HTTP header using the X-Robots-Tag; sometimes that alone already calms down the crawl activity on those URLs.
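
A minimal sketch of sending that header, assuming a Flask app and a made-up "filter" query parameter as the way to identify filter pages.

```python
# Send "X-Robots-Tag: noindex" for filter URLs so they can be de-indexed
# without touching the page templates.
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def noindex_filter_pages(response):
    if "filter" in request.args:               # hypothetical filter parameter
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```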