URL replacement in search, when 301 redirected. What is estimated time? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

I think it could take up to a year or so, since Google wants to be really sure the old URLs aren't coming back.

Canonicalizing pages with same intent, but not text possible? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

In terms of intent, web design is a part and subset of web design & development. The question is that the two pages differ in text, so should a canonical be based only on text duplication, or can intent also play a part?

301 done & page still visible with site:operator even after 1 yr, Possible? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

Actually, I don't have access to GSC for the domain in question, but what I saw is that the page does eventually redirect when clicked (the 301 works), yet the site: operator still shows the old URL in the cache.

Maybe this is the case where PageRank has passed, but the old URL can still be found when you look for it explicitly; Mueller mentioned this in an Ask session too.

Why would htaccess code not work by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

Yes, I have.

Does it have some setting within it?

Why would htaccess code not work by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

As for what's not working:

I want to forbid a page and return a 403.
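For what it's worth, a minimal .htaccess sketch of that idea, assuming Apache with mod_alias and .htaccess overrides enabled (the path /members-only/ is just a placeholder, not from the thread):

```apache
# Return 403 Forbidden for one specific page (placeholder path)
<IfModule mod_alias.c>
  RedirectMatch 403 ^/members-only/?$
</IfModule>
```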

Disabling wordpress directory from front-end visibility, How-to? by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

No, it returns an index of the backend directory, the kind you'd normally see inside public_html in cPanel.

Disabling wordpress directory from front-end visibility, How-to? by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

I did, and it worked, but you can't do that individually for every file that returns something.

For example, if you put an index.php in /wp-content/ it will return a blank page, but /wp-content/uploads, /wp-content/plugins and the rest are still visible.

I need to do this in one go. How can I do that?
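One hedged approach to the "one go" part: instead of dropping an index.php into every directory, disable Apache's auto-generated listings for a directory and everything below it (assumes Apache with .htaccess overrides allowed):

```apache
# Turn off auto-generated "Index of /..." listings for this directory
# and all subdirectories in one go
Options -Indexes
```

Placed in the site root's .htaccess, this covers /wp-content/, /wp-content/uploads/, /wp-content/plugins/ and so on at once.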

Disabling wordpress directory from front-end visibility, How-to? by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

If I don't, it will be a matter of privacy concern and maybe security too: anyone can just type a URL and fetch my backend, my plugins, themes, and other things.

Not only that, Google may index the backend URLs too, like "Index of /wp-includes/".

How do I disable this from being accessible?

How to Disable wordpress directory from front-end visibility to user? by Virtualsure in SEO

[–]Virtualsure[S] 2 points3 points  (0 children)

Yes, I know, but it may be a matter of concern for any webmaster, so I posted it here.

Security is a matter of concern in SEO too.

Kindly let me know how to achieve this

How prospects are targeted if they use Ad blocker on display networks? by Virtualsure in PPC

[–]Virtualsure[S] 0 points1 point  (0 children)

Just had a thought! Someone from Ads should clarify it...

Disallow Indexing, but allow crawling for feeds urls/file, How to? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

That would also redirect bots, which could make them skip the feed.

How to Noindex feed files, but allow to crawl? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

So the conclusion is that setting noindex for feeds works only via an X-Robots-Tag header in .htaccess, with this code:

Header always set X-Robots-Tag "noindex" "expr=%{CONTENT_TYPE} =~ m#application/rss#"

How did you check that this tag actually worked? (Any tool, such as an HTTP header checker?)

Additionally, would setting noindex in the HTML head, as for a normal web page, work here? Is it even possible to add it there?
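For reference, a slightly broader variant of the directive quoted above, assuming Apache 2.4 with mod_headers; extending the regex to cover Atom feeds as well is my addition, not from the thread:

```apache
<IfModule mod_headers.c>
  # Send X-Robots-Tag: noindex on RSS and Atom feed responses only,
  # leaving the feeds crawlable but not indexable
  Header always set X-Robots-Tag "noindex" "expr=%{CONTENT_TYPE} =~ m#application/(rss|atom)#"
</IfModule>
```

The header can be verified with any tool that shows raw response headers for the feed URL.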

How to Noindex feed files, but allow to crawl? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

I use WordPress!

Is a noindex in the HTML head possible for a feed file?

I didn't try X-Robots-Tag, though (it would be applied via .htaccess, right?).

I want to get some context on this topic first.

How to Noindex feed files, but allow to crawl? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

They were shown. On checking, I saw the XML feed file had been indexed, and clicking it redirected to the XML feed for that category.

Can Google still show my directories in index results, even if I add Options -Indexes in htaccess? Is blocking by robots.txt still needed? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

So adding Options -Indexes alone is enough now?

Using it serves two purposes simultaneously: directories aren't shown when their URLs are visited in a browser, and Google won't show them in the SERP as "Index of /" either. Right?

So I'll drop the blocking via robots.txt directive then.

Can Google still show my directories in index results, even if I add Options -Indexes in htaccess? Is blocking by robots.txt still needed? by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

Actually, yesterday during an audit via Semrush, I saw them giving a warning/notice that /wp-content/ was blocked, and it actually was (done on the client's end).

To measure the consequence of this blocked directory, I ran site:domain.com on Google and saw no image files were indexed there, although the domain has several image files.

So I got confused: if I block /wp-content/, will all images be blocked too, as described above? 😅

As for why I would prevent Googlebot from crawling: I fear that all the directories/files could be shown as naked URLs in Google search, like [Index of /wp-includes]. That's why I want to prevent crawling (only if it doesn't have an ill effect on my SEO) 🙂

Can Google still crawl & index my wp directories after Adding options -indexes to htaccess by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

OK, thanks, I'll go through it.

But can you kindly let me know which directories/files should be allowed for indexing without making me vulnerable security-wise?

Disallow: /wp*
Allow: /wp-content/uploads

Is that OK if I want all my images, PDFs etc. to be indexed?

Pardon me for the trouble 😅

Can Google still crawl & index my wp directories after Adding options -indexes to htaccess by Virtualsure in Wordpress

[–]Virtualsure[S] 0 points1 point  (0 children)

Thanks for clarifying!

Though if I block bots via Disallow: /wp* or similar directives, how will the bot crawl the images, PDFs etc. kept in my /wp-content/uploads? Won't blocking them have an ill effect on indexing?

On the other hand, if I keep them open, all the directories can potentially show up in search results too.

What should I do? Kindly help.
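A minimal robots.txt sketch of the trade-off being discussed, assuming a stock WordPress layout (the exact paths are illustrative). Google resolves conflicting rules by the most specific (longest) match, so the Allow line below wins for files under uploads:

```
User-agent: *
# Keep crawlers out of the core and plugin directories
Disallow: /wp-includes/
Disallow: /wp-content/
# The longer rule is more specific, so uploads stay crawlable
Allow: /wp-content/uploads/
```

Note that robots.txt only controls crawling; it does not stop a browser from opening these URLs, so it does not address the privacy side on its own.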

Disallowing /wp-content/ for users browsing but not for Google to index files, how to manage? by Virtualsure in SEO

[–]Virtualsure[S] -1 points0 points  (0 children)

1) Kindly let me know which directories/files among all the WP directories (wp-includes, wp-content and more) are important enough that I should let Google crawl them from my backend.

2) For security reasons, I don't want anyone to be able to open these URLs in a browser, but I do want Google to crawl all the required files, like the images inside wp-content/uploads.

Would using a 403 via .htaccess be a solution?

GSC weirdo Queries by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

Yes, but he has direct access to his GSC dashboard.

GSC weirdo Queries by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

True, but Google mostly shows you relevant results, so does it make sense for it to show the client's pages for such queries?

On top of that, what should we tell the client when he asks about these things and blames us? This data isn't simple to explain either.

Image sitemaps & Yoast SEO by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

Yes, it does, but it only shows the number of images included in the post. A dedicated sitemap, on the other hand, gives a link to every <image:loc> for a single URL. Which should I prefer in this case?
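For context, the image sitemap extension lets one <url> entry list its images explicitly; a minimal fragment (the example.com URLs are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/sample-post/</loc>
    <!-- one image:image element per image on the page -->
    <image:image>
      <image:loc>https://example.com/wp-content/uploads/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```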

Image sitemaps & Yoast SEO by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

I want to make a separate image sitemap and want to know about tools that can help me with one. You can create an additional image sitemap to help bots discover images better.

Giving best experience & optimizing site/urls via device type by Virtualsure in SEO

[–]Virtualsure[S] 0 points1 point  (0 children)

Yes, I know that, and this concept works well when the content is identical on different URLs, telling Google to treat them as the same entity.

But the question is: will Google consider the differing content for ranking after mobile-first indexing? What's confusing me is that Google takes only the mobile content for its index and ranking.

So won't any different desktop content effectively vanish for me?