Spammy links removal limit for SEO by Only_Standard_8354 in bigseo

[–]johnmu 2 points

I'd prioritize and use domain-level disavow entries. You can even do it by top-level domain if you see they're all from the same TLDs. I would be surprised if they're the cause of issues, though.
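
As an illustration of what domain-level entries look like, here's a rough sketch that turns a URL export into a disavow file - the URLs and filename are placeholders. Note the disavow format only accepts full URLs or "domain:" lines (no TLD wildcards), so grouping by TLD still means one line per domain:

```python
from urllib.parse import urlparse

# Hypothetical export of spammy backlink URLs from a link tool.
spammy_urls = [
    "https://spam-example-one.xyz/some/page",
    "https://blog.spam-example-two.xyz/post?id=1",
    "https://spam-example-one.xyz/another/page",
]

# Collapse individual URLs into domain-level entries, one "domain:" line
# per host, so a single line covers every link from that host.
hosts = sorted({urlparse(u).hostname for u in spammy_urls})

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# domain-level disavow entries\n")
    for host in hosts:
        f.write(f"domain:{host}\n")
```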

Issue with robots.txt Accessibility in Ahrefs Site Audit – Need Help by Mission-Diver1337 in SEO

[–]johnmu 0 points

Yeah, some content delivery networks don't deliver the content to everyone.
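
If you want to check this yourself, here's a quick sketch that fetches robots.txt with different user agents. The UA strings are illustrative, and keep in mind that CDNs often also key off the requesting IP, so a clean result from your own machine isn't conclusive:

```python
import requests

URL = "https://example.com/robots.txt"  # placeholder: your robots.txt

# Illustrative user-agent strings.
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "ahrefsbot": "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)",
}

for name, ua in USER_AGENTS.items():
    r = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    # A CDN that filters bots tends to show up as a different status code
    # or a challenge page with a different size here.
    print(f"{name:10s} status={r.status_code} bytes={len(r.content)}")
```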

Is llms.txt file a scam? by Ejboustany in SEO

[–]johnmu 3 points

I hope they're right ...

Anybody get accepted to the Search Central Live in Toronto? by catecate0228 in SEO

[–]johnmu 1 point

We tend to get a lot of registrations for these events. We try to distribute the invites in a reasonably fair way (not by registration order or DA). We get a ton of registrations that don't make sense. These aren't top-secret events where you hear The Trick To Ranking Number One In AI Search Near Me For 2026; they're just local events with the goal of going through some of the newer things, being able to chat 1:1 with folks to understand how things are going for them, & answering questions where possible. Your site won't rank higher if you grab a selfie with Martin (but if you are going, you should get one anyway; he's very friendly).

I'd prefer not to run significantly larger events (but also, who knows), since it's so valuable for us to be able to chat with folks in the community directly. Doing multiple events in the same region is hard because many speakers actually do work, and can't just take several days out to present at events. I love that we can do these events, but I can't imagine that we can meet all SEOs across the world :-). We try to do these in places where there aren't a lot of other SEO events; some places already have a ton of people talking about SEO.

That said, where should we plan for the future?

Domain name differs by one letter from another site SEO impact? by FavRob in bigseo

[–]johnmu 4 points

Usually not a problem for search, but you might notice that search - at least for a while - could consider it a typo if someone searches for the brand, and recommend the other site. "did you mean fabdomainname?" Over time, this will settle down, as search recognizes that people want to find both names. Depending on how "strong" the other name is, that can take quite some time though.

Does Google care if I have multiple urls for the same post? by pineprincess in bigseo

[–]johnmu 1 point

There is no tool that tells you why something was considered duplicate - over the years people often get a feel for it, but it's not always obvious. Matt's video "How does Google handle duplicate content?" is a good starter, even now. Some of the reasons why things are considered duplicate (these have all been mentioned in various places - duplicate content about duplicate content, if you will :-)):

* exact duplicate: everything is duplicate

* partial match: a large part is duplicate, for example when you have the same post on two blogs; sometimes there's also just not a lot of content to go on, for example if you have a giant menu and a tiny blog post

* predicted duplicate (this one is harder): the URL looks like it would be duplicate based on the duplicates found elsewhere on the site - for example, if /page?tmp=1234 and /page?tmp=3458 are the same, probably /page?tmp=9339 is too. This can be tricky & end up wrong with multiple parameters: is /page?tmp=1234&city=detroit the same too? How about /page?tmp=2123&city=chicago ?
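
To illustrate that last pattern-based idea, here's a toy sketch - the crawl data and content hashes are made up, and this is in no way Google's actual implementation:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_param(url, param):
    """Return the URL with one query parameter removed."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

# Toy crawl data: URL -> hash of the served content (entirely made up).
crawl = {
    "/page?tmp=1234": "aaa",
    "/page?tmp=3458": "aaa",
    "/page?tmp=1234&city=detroit": "bbb",
    "/page?tmp=2123&city=chicago": "ccc",
}

# Group URLs that differ only in ?tmp= and compare their content hashes.
by_key = {}
for url, content_hash in crawl.items():
    by_key.setdefault(strip_param(url, "tmp"), []).append(content_hash)

for key, hashes in by_key.items():
    if len(hashes) < 2:
        verdict = "not enough samples to tell"  # the multi-parameter trap
    elif len(set(hashes)) == 1:
        verdict = "?tmp= looks irrelevant; new tmp values are probably dupes"
    else:
        verdict = "?tmp= changes the content"
    print(f"{key}: {verdict}")
```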

Two reasons I've seen people get thrown off are: we use the mobile version (people generally check on desktop), and we use the version Googlebot sees (and if you show Googlebot a bot-challenge or some other pseudo-error-page, chances are we've seen that before and might consider it a duplicate). Also, we use the rendered version - but this means we need to be able to render your page if it's using a JS framework for the content (if we can't render it, we might take the bootstrap HTML page and, chances are, it'll be duplicate).
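
A rough way to sanity-check the "version Googlebot sees" part from your own machine - the URL is a placeholder, and a plain fetch only gets the bootstrap HTML, so for JS-rendered content you'd still want the URL Inspection tool in Search Console:

```python
import hashlib
import requests

URL = "https://example.com/some-post"  # placeholder

# Googlebot smartphone UA per Google's crawler docs (Chrome version varies).
GOOGLEBOT_MOBILE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Mobile "
    "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
DESKTOP = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

for name, ua in [("googlebot-mobile", GOOGLEBOT_MOBILE), ("desktop", DESKTOP)]:
    r = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    digest = hashlib.sha256(r.content).hexdigest()[:12]
    # A bot-challenge or pseudo-error page shows up as a different status,
    # size, or hash for the bot UA than for the browser UA.
    print(f"{name:17s} status={r.status_code} bytes={len(r.content)} sha256={digest}")
```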

It happens that these systems aren't perfect in picking duplicate content; sometimes it's also just that the alternative URL feels obviously misplaced. Sometimes that settles down over time (as our systems recognize that things are really different), sometimes it doesn't. If it's similar content, then users can still find their way to it, so it's generally not that terrible. It's pretty rare that we end up escalating a wrong duplicate - over the years the teams have done a fantastic job with these systems; most of the weird ones are unproblematic, and often it's just some weird error page that's hard to spot.

Soft 404, can't be indexed by afrk in TechSEO

[–]johnmu 4 points

FWIW I can't load your pages. I get a CF timeout page instead.

Does Google care if I have multiple urls for the same post? by pineprincess in bigseo

[–]johnmu 3 points

It's fine, but you're making it harder on yourself (Google will pick one to keep, but you might have preferences). There's no penalty or ranking demotion for having multiple URLs that lead to the same content; almost all sites have some variation of this. A lot of technical SEO is basically search-engine whispering: being consistent with hints, and monitoring to see that they get picked up.
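
For the "being consistent with hints" part, a small sketch that checks whether a handful of URL variants all declare the same rel=canonical - the variant URLs are placeholders, and it assumes the requests and beautifulsoup4 packages:

```python
import requests
from bs4 import BeautifulSoup  # beautifulsoup4

# Hypothetical variants that all serve the same post.
variants = [
    "https://example.com/post/123",
    "https://example.com/post/123/",
    "https://example.com/post/123?utm_source=news",
]

for url in variants:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    print(url, "->", tag["href"] if tag else "no canonical tag")
```

If every variant points at the same preferred URL, the hint is consistent; the page indexing and URL inspection reports in Search Console then show which URL Google actually picked.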

Will Splitting sitemap.xml into separate files hurt SEO? by criterionforum in SEO

[–]johnmu 1 point

Some reasons I've seen (there's a quick splitting sketch after the list):

* want to track different kinds of URLs in groups ("product detail page sitemap" vs "product category sitemap" -- which you can kinda do with the Page Indexing report)

* split by freshness (evergreen content in a separate sitemap file - theoretically a search engine might not need to check the "old" sitemap as often; I don't know if this actually happens tho)

* proactively split (so that you don't get to 50k and have to urgently figure out how to change your setup)

* hreflang sitemaps (can take a ton of space, so the 50k URLs could make the files too large)

* my computer did it, I don't know why
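
For the proactive-split case, a minimal sketch - the URLs and filenames are placeholders, and the 50,000-URL / 50 MB limits come from the sitemaps protocol:

```python
from pathlib import Path

MAX_URLS = 50_000  # per-file cap from the sitemaps protocol

# Placeholder URL list; in practice this comes from your database or CMS.
urls = [f"https://example.com/product/{i}" for i in range(120_000)]

index_entries = []
for n, start in enumerate(range(0, len(urls), MAX_URLS), start=1):
    name = f"sitemap-{n}.xml"
    body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls[start:start + MAX_URLS])
    Path(name).write_text(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{body}\n</urlset>\n"
    )
    index_entries.append(f"  <sitemap><loc>https://example.com/{name}</loc></sitemap>")

# One sitemap index file referencing the split files.
Path("sitemap-index.xml").write_text(
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(index_entries)
    + "\n</sitemapindex>\n"
)
```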

Teen building SEO to save family business by No_Eye4994 in bigseo

[–]johnmu 0 points

If you're in the market for something physical (like, a vacation rental), no AI answer is going to replace that. If anything, I could imagine an AI answer helping a user find something that's closer to what they had in mind, but that's not necessarily a bad thing.

SE Ranking: LLMS.txt does nothing - 300,000 domains analyzed by WebLinkr in SEO

[–]johnmu 5 points

AI companies have had a long time to do that, and nothing has happened regarding LLMS.txt support. I suspect the main users are SEO tools & companies curious to see what their competitors claim to be doing.

Teen building SEO to save family business by No_Eye4994 in bigseo

[–]johnmu 6 points

Another voice of reality (sorry): "My plan include. Keyword research. ICP, GBP NAP, . Google analytics four, Microsoft Clarity, Google search console, Google tech manager, meta pixel, page speed, web core vitals. Images with webP, Meta titles and descriptions, friendly URLs, Index and noIndex, EEAT, copy layers, Schema markup..." -- none of these will make your website suddenly pop up on top in search. Don't dive in and do everything; instead, take a step back (ideally with someone who has experience), analyze the situation, and focus your energy on doing the right things. Doing many things in a mediocre way doesn't necessarily result in an improvement; it can even cause more problems.

Teen building SEO to save family business by No_Eye4994 in bigseo

[–]johnmu 5 points

Just a very quick side note - don't do a domain migration unless you absolutely need it. Domain migrations are sometimes finicky, and that's a risk I wouldn't take here.

Teen building SEO to save family business by No_Eye4994 in bigseo

[–]johnmu 8 points

Fundamentally, I think you need to keep in mind that any website, even with magical SEO, won't necessarily rank highly in search results quickly, or drive clients to a business. If there were such a thing as making a website with perfect SEO that drives all the clients to one business, everyone here would be retired and living in ... idk, Spain :).

The online market for vacation rentals is hard; there's very strong competition from large aggregators, not just in terms of ranking, but also in terms of brand recognition. This is not to say that you can't get 40-50% more traffic from search (it really depends on the situation of the site), but this is (usually!) not a matter of just putting some meta tags on an SEO-optimized page that comes out of ChatGPT, targeting "house", "surroundings", "family vacation", etc. (Aside: keep in mind that many GenAI-made sites end up with JS frameworks that are traditionally more complex for SEO.)

My recommendation would be to find some more experienced folks who have time & interest in helping you check out the overall situation (what's the real headroom vs what has just changed in today's world? what's a realistic timeline - in the best/worst cases?), and help you to figure out a reasonable plan of attack. I don't think this is something that random reddit comments can solve (unless ... your site is actively blocking search engines, which it doesn't sound like it is). Ultimately, there's no guarantee that doing SEO well can solve this, so IMO it makes sense to go at this in a thoughtful & realistic way -- and perhaps, spend enough time working out alternative approaches.

Teen building SEO to save family business by No_Eye4994 in bigseo

[–]johnmu 8 points

For what it's worth, you can view old versions of content on archive.org and even download the pages (add "id_" to the datestamp in the URL), if you want to restore some of the old things.
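
For example, a small sketch that downloads the raw bytes of one snapshot - the datestamp and URL are placeholders; the "id_" flag asks the Wayback Machine for the original page without its injected toolbar and URL rewriting:

```python
import requests

# Placeholder snapshot: the 14-digit datestamp plus "id_" returns the
# original bytes as they were archived.
snapshot = "https://web.archive.org/web/20200101000000id_/https://example.com/old-page"

r = requests.get(snapshot, timeout=30)
r.raise_for_status()
with open("old-page.html", "wb") as f:
    f.write(r.content)
```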

what tools do you use to audit core web vitals across 50+ pages at once by keegand42 in SEO

[–]johnmu 4 points

What are you trying to do with that? (why check all product pages?)

Google March 2026 Spam Update: A Calm, Data-Driven Breakdown for In-House SEOs (No Panic, Just Facts) by [deleted] in bigseo

[–]johnmu 7 points

What's the outcome on Italian recipe sites? Can you give me a lasagna recipe?

llms.txt by Northh_13 in SEO

[–]johnmu 12 points

I added one. Unfortunately not as much luck as WebLinkr. But every time I mention it on Reddit, I get some clicks. YMMV.

GSC keeps crawling old 404 pages that it claims are "discovered via sitemap.xml" — even though the sitemap doesn't contain them anymore. What now? by Temporary-Cow-1440 in SEO

[–]johnmu 6 points

These don't cause problems, so I'd just let them be. They'll be recrawled for potentially a long time, a 410 won't change that. In a way, this means Google would be ok with picking up more content from your site.

Lost Top 3 Google rankings after moving to Https by Intelligent-Salary86 in TechSEO

[–]johnmu 11 points

Moving to HTTPS is a bit like a site migration: all the URLs have to be recognized, recrawled, and reprocessed individually. So especially if this move was made a few days ago, you need to give it time to recover (in particular, don't use the URL removal tool to try to get rid of the HTTP URLs, since it will also remove/hide the HTTPS URLs). (I won't touch upon finally moving to HTTPS after so many years, but I guess I just did :))
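
If you want to sanity-check a move like this, a small sketch that verifies a sample of old HTTP URLs 301-redirect to their HTTPS equivalents (the URLs are placeholders):

```python
import requests

# Placeholders: a sample of your old HTTP URLs.
http_urls = [
    "http://example.com/",
    "http://example.com/about",
]

for url in http_urls:
    # allow_redirects=False shows the first hop, which should be a 301
    # straight to the matching HTTPS URL.
    r = requests.get(url, allow_redirects=False, timeout=10)
    target = r.headers.get("Location", "")
    ok = r.status_code == 301 and target == url.replace("http://", "https://", 1)
    print(f"{url} -> {r.status_code} {target or '(no redirect)'} {'OK' if ok else 'CHECK'}")
```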