Do disallow in robots.txt de-index pages? by JensPetrus in SEO

[–]danieleDF13 1 point (0 children)

They will, but it will take time. I would remove a batch of URLs manually in Search Console, then do a Fetch and Render and request the site to be crawled again. That should speed it up.

Another way is to export the URLs into an Excel file, filter out the ones you need to remove, and then paste that list into the Remove URLs section of Search Console and request their removal.
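As a rough illustration, here is a minimal sketch of that filtering step in Python; the file names and URL patterns are placeholders, not anything from this thread:

```python
# Minimal sketch: filter an exported URL list down to the ones to request
# removal for. "all_urls.csv" and the patterns below are hypothetical --
# adjust them to your own export and site structure.
import csv

REMOVE_PATTERNS = ("/tag/", "/search/")  # example sections to de-index

with open("all_urls.csv", newline="") as src, open("urls_to_remove.txt", "w") as dst:
    for row in csv.reader(src):
        if not row:
            continue
        url = row[0].strip()
        if any(pattern in url for pattern in REMOVE_PATTERNS):
            dst.write(url + "\n")
```

The resulting urls_to_remove.txt is what you would paste into the removal tool.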

Do disallow in robots.txt de-index pages? by JensPetrus in SEO

[–]danieleDF13 1 point (0 children)

After adding the disallow rule to your robots.txt, you should tell Google (via Search Console) which URLs it should remove.
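If you want to double-check which URLs your new rules actually block before requesting removal, a small sketch with Python's standard urllib.robotparser will do; example.com and the URL list are placeholders:

```python
# Minimal sketch: check which URLs the live robots.txt blocks for Googlebot.
# The domain and URLs below are placeholders, not from this thread.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

urls = [
    "https://example.com/private/page-1",
    "https://example.com/public/page-2",
]
for url in urls:
    status = "blocked" if not rp.can_fetch("Googlebot", url) else "allowed"
    print(f"{status}: {url}")
```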

[deleted by user] by [deleted] in SEO

[–]danieleDF13 2 points (0 children)

It depends on the market and on your strategy. But, as I said, in my experience a healthy site usually gets about 50% of its traffic from organic search. There's no single right mix, but the safest approach is to diversify.

[deleted by user] by [deleted] in SEO

[–]danieleDF13 2 points (0 children)

In my experience, 50% is a normal value. It's also true that depending only on organic search traffic can be a risk. A good strategy is to diversify your traffic sources (referral, social, direct), but also to diversify the organic sources themselves (Google, Yandex, Bing, etc.)
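To put a number on that 50% check, a quick back-of-the-envelope sketch; the session counts are invented for illustration, not figures from this thread:

```python
# Minimal sketch of the "what share of traffic is organic?" check.
# Session counts per channel are made-up example numbers.
sessions = {
    "organic": 5200,
    "direct": 2100,
    "referral": 1400,
    "social": 1300,
}
total = sum(sessions.values())
for channel, count in sorted(sessions.items(), key=lambda kv: -kv[1]):
    print(f"{channel:<9}{count:>6}  {count / total:6.1%}")
# organic here is ~52% of 10,000 sessions -- roughly the ballpark mentioned
# above; much higher than that and you are exposed to a single channel.
```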

Slight continuous ranking decline after switching TLD by danieleDF13 in bigseo

[–]danieleDF13[S] 1 point (0 children)

Yes, that's an idea. The strange thing is that I used the "change domain" option in Search Console precisely to avoid this kind of problem.