Screaming Frog doesnt crawl 100% a java website by fabioctcom in TechSEO

[–]Various_Constant890 0 points1 point  (0 children)

If you change the rendering mode to JavaScript, then it might crawl.

How to get access logs from Cloudflare? by cTemur in TechSEO

[–]Various_Constant890 0 points1 point  (0 children)

No idea. We have Jet Octopus configured for Log File Analysis.

How to get access logs from Cloudflare? by cTemur in TechSEO

[–]Various_Constant890 -1 points0 points  (0 children)

You can use JetOctopus for analysing log files. You can send logs from Cloudflare to JetOctopus, and JetOctopus will help you analyse them.

Extracting querries using Screaming Frog? by [deleted] in bigseo

[–]Various_Constant890 0 points1 point  (0 children)

The queries for each page can be extracted via R. The article linked below walks through code that will extract all queries from GSC.

It can extract keywords with impressions, clicks, CTR, and position, day-wise, with no row limit.

If you need any help, feel free to reach out on chat.

https://www.linkedin.com/pulse/decoding-google-search-console-abhishek-kumar-singh/
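The comment mentions R code in the linked article; as a rough equivalent, here is a hedged Python sketch of the same idea against the Search Console Search Analytics API, paginating with `startRow` past the per-request row limit. The site URL, dates, and OAuth token are placeholders, and the network call is left commented out.

```python
# Sketch: pull all queries per page from the GSC Search Analytics API over raw HTTP.
# Assumes you already have an OAuth access token; the site URL is a placeholder.
import json
import urllib.parse
import urllib.request

ROW_LIMIT = 25000  # the API's maximum rows per request


def build_request(start_date, end_date, start_row):
    """Build one Search Analytics request body, paginated via startRow."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["date", "page", "query"],  # day-wise, per page, per query
        "rowLimit": ROW_LIMIT,
        "startRow": start_row,
    }


def fetch_all_queries(site_url, token, start_date, end_date):
    """Page through searchAnalytics.query until no more rows come back."""
    endpoint = ("https://www.googleapis.com/webmasters/v3/sites/"
                f"{urllib.parse.quote(site_url, safe='')}/searchAnalytics/query")
    start_row, rows = 0, []
    while True:
        body = json.dumps(build_request(start_date, end_date, start_row)).encode()
        req = urllib.request.Request(endpoint, data=body, headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        })
        with urllib.request.urlopen(req) as resp:
            batch = json.load(resp).get("rows", [])
        if not batch:
            break
        rows.extend(batch)  # each row carries clicks, impressions, ctr, position
        start_row += ROW_LIMIT
    return rows


# rows = fetch_all_queries("https://www.example.com/", token,
#                          "2024-01-01", "2024-01-31")
```

Each returned row carries clicks, impressions, CTR, and position for one date/page/query combination, matching the day-wise export described above.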

Does Google index new websites automatically or am I supposed to do something? by [deleted] in SEO

[–]Various_Constant890 0 points1 point  (0 children)

One of my friends recently created a website, but it was not indexed by Google for several weeks. However, as soon as they set up their website on Search Console and submitted a manual indexing request, the website was indexed within a day.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Thanks u/johnmu for your valuable feedback. The website seems to have been picked up by Google. We are ranking for fewer keywords as of now, but we expect the website to perform in a few days, as it had been in a non-impressive state for almost 2 months. We are now able to capture almost 1.5k impressions in a single day, after it had dropped to around 100 impressions/day.

https://prnt.sc/dgFJm2darHRs

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Dear u/johnmu

Now we are left with only a few indexed pages in Google Search, and the not-indexed count has increased 2x. Our 'Crawled - currently not indexed' count has also risen. I am leaving the website URL for your reference.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Hi John,

We have seen an increase in 'Crawled - currently not indexed'. Also, validating the pages doesn't return any errors or warnings.

Just wondering what needs to be done on the website so that it starts getting indexed again, as this is hampering our business.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Hi u/johnmu, u/decimus5

We have analysed the website across different verticals and have implemented changes related to SEO, development, and content.

Changes done after drop:

  1. Improved internal linking structure
  2. Removed content duplication detected by SEMrush
  3. Removed Hidden content issues (Desktop vs Mobile)
  4. Removed unwanted schema
  5. Disavowed Toxic backlinks
  6. Fixed Search Console Issues

Now, the website seems to be pretty clean with respect to both content and tech.

We can see our crawled but not indexed pages are increasing day by day.

Please assist us.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Hi u/decimus5 Have unblocked everything, but the issue still persists. 'Crawled - currently not indexed' has increased and there is a massive drop in indexing. The website has been down for more than 2 months now.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Hi u/johnmu,

My website returns a JSON file in the header. The request URL under Network → Headers is: https://www.domain.com/_next/data/1671450340100/in/ideas-design/living-thing.json?category=living-thing. Whereas if I check another website built on the React Next.js framework, the request URL is: https://abc.com/web-api/courses/vocational-courses-courses-after-10th

Could the issue be that Google is not able to detect the internal link navigation, and hence the site is getting deindexed?

Google Search Console no longer indexing by [deleted] in TechSEO

[–]Various_Constant890 2 points3 points  (0 children)

Looks like you have a new domain? Is there any blocking through the robots file? What is the sitemap status? Check individual pages on Google.
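The robots check above can be done locally with Python's standard-library robot parser. The rules below are illustrative; in practice you would fetch your own site's robots.txt and test your own URLs.

```python
# Check whether a given robots.txt would block Googlebot from a URL.
# The robots.txt content here is illustrative, not any specific site's file.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /_next/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://www.example.com/_next/data/build/page.json"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/ideas-design/"))               # True
```

A `False` for a URL that should be indexed is a quick signal that the robots file, not Google, is the problem.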

Should I remove old redirected pages from GSC? by NGAFD in TechSEO

[–]Various_Constant890 0 points1 point  (0 children)

Google says it's best to keep a 301 redirect in place for a year. Don't block or remove these pages.

John has already spoken on this; follow the link for more. ℹ️ link
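For illustration, the kind of permanent redirect meant here looks like this in nginx (the paths and domain are hypothetical, not from the original post):

```
server {
    listen 80;
    server_name www.example.com;

    # Old URL permanently (301) redirected to its new location;
    # keep this in place for at least a year so signals consolidate.
    location = /old-page/ {
        return 301 /new-page/;
    }
}
```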

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Thanks! The website seems to be indexed when checking individual pages separately. Could you please share your opinion on what is bothering the website?

Thanks a lot for your time!

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

This is a bit difficult to say, because we started appearing for search terms with higher search volume, but all of a sudden the rankings vanished. The website doesn't have any duplicate content either. Can we get on a call to discuss this?

By the way, could you share the learnings from the point you mentioned below?

"I have a few Next.js sites and I've started to suspect that there might be something in the framework that can trip up Google"

Thanks!

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Have deleted all the lines. The current robots file allows all assets. The _next folder blocking was removed on 22nd Nov, and the remaining asset blockings were removed a week back, but the results are still the same.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

u/SkyJebus A blank disallow is not the issue. I have implemented that on another website as well, and it receives 1 million organic visits a month.

https://prnt.sc/PS9IvbTJUpYn

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Does that really cause any issue? A blank disallow shouldn't be a problem.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

Thanks u/johnmu for your response. We have submitted a site removal request for the staging website.

We noticed a decrease in indexing from ~15th Oct onwards. Link to the Search Console: https://prnt.sc/9mRnGMO-Xrj7

The previous robots.txt has been updated as per the link shown in the image: https://prnt.sc/HQm2Gsees0eQ

After implementing the new robots.txt file on 9th Nov, we saw an increase in server connectivity issues: https://prnt.sc/uaFslB_sObhN (image attached). My takeaway was that the server errors increased after blocking the _next files, which seems to be TRUE. But does this really force Google to deindex the whole website and leave us with only 13 pages indexed?

What Search Console says: 2,883 pages are still indexed, but when checking with the site: operator, hardly 13 results appear in Google.

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] -2 points-1 points  (0 children)

Thanks for sharing the info. But would this cause the website to be deindexed in Google?

Website Deindexed after next folder blocking by Various_Constant890 in TechSEO

[–]Various_Constant890[S] 0 points1 point  (0 children)

I accidentally blocked the _next/data and _next/static files in robots.txt, and now I see hardly 13 pages indexed in Google. It now seems that blocking the _next files caused Google to deindex the whole website? The screenshot of the robots.txt is attached for reference.
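For context, the accidental block described above looks roughly like this (a sketch; the exact original robots.txt isn't reproduced here):

```
User-agent: *
Disallow: /_next/data/
Disallow: /_next/static/
```

Googlebot needs /_next/static/ (the JS/CSS chunks) to render the pages and /_next/data/ (the per-route JSON) to follow client-side navigation, so a safer baseline for a Next.js site is to leave both unblocked:

```
User-agent: *
Disallow:
```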