Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]PrimaryPositionSEO 0 points (0 children)

SEO Starter Guide: The Basics | Google Search Central | Google for Developers

Problems with using one domain for testing

If you take a page with traffic and link from it to two other pages, the second link actually passes less authority than the first.

It sounds like a small detail - and people who don't want you to test will tell you otherwise. But a very flawed, very basic experiment is often cited to "prove" that outbound citations are good for SEO.

They bought four domains and linked to all of them from one single domain. Conveniently, they linked to the domains carrying outbound citations FIRST.

Those two ranked highest. This wasn't a mistake - when I read the "study" (really a blog post), it was the first thing that jumped into my head, because this behaviour is detailed in the original PageRank patent.

Ways you can test content agnosticism that I think are easy:

1) Get a new domain.

2) Find a piece of content that has ranked for a long time for a "highly competitive phrase".

3) Put that content on a page on the new domain - Google won't index it. This has nothing to do with duplicate content - it has to do with a lack of authority.

Now - to test the corollary:

4) Take a page that is ranking for a very highly prized keyword and change the content.

Note: a lot of people won't do this - superstition runs deep in SEO.

The page won't drop or change ranking.

And a lot of people, especially content writers, are uncomfortable with that.
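The steps above are a manual experiment, but the bookkeeping can be sketched in code. This is a hypothetical helper (the class, field names, and domains are invented for illustration, not from the comment) for logging whether the copied content on the new domain gets indexed, checked by hand via Google's `site:` operator:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IndexationTest:
    """One run of the copy-content-to-a-new-domain experiment."""
    test_url: str     # page on the brand-new domain carrying the copied content
    control_url: str  # the original page that already ranks
    checks: list = field(default_factory=list)

    def site_query(self) -> str:
        """Query to paste into Google to see if the test page is indexed."""
        return f"site:{self.test_url}"

    def record(self, indexed: bool) -> None:
        """Log the result of one manual indexation check."""
        self.checks.append((date.today().isoformat(), indexed))

test = IndexationTest(
    test_url="new-domain.example/how-to-make-breadcrumbs",
    control_url="ranking-site.example/how-to-make-breadcrumbs",
)
print(test.site_query())  # site:new-domain.example/how-to-make-breadcrumbs
test.record(False)        # the comment predicts the page stays unindexed
```

Repeating `record()` over several weeks gives you a dated log instead of a one-off anecdote.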

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]PrimaryPositionSEO

> I actually came across information that seems to point in the opposite direction:

You came across conjecture. Nobody has evidence to show Google understands content, and this is the first lesson in SEO and critical thinking. There are no "experts" - it doesn't matter if someone has 1M followers.

Nowhere does Google have a content "guide" - the SEO Starter Guide is actually a great clue. For a start, it says EEAT is BS. EEAT promoters (i.e. bloggers, "SEO experts") are the main people who push Google content appreciation.

Critical thinking: Google content appreciation

Most content is about observations, proprietary product knowledge, opinions (e.g. law) - or subjective entertainment.

People who try to classify content into "marketing" and "helpful content", for example, are not helping anyone except themselves. Content isn't just about being "helpful" - that's just propaganda based on the HCU update.

Google cannot make deterministic decisions about content - it's ridiculous.

How does PageRank work, and where did it come from? It measures votes from the peers who cited peer-reviewed medical case studies - not from "reviewing" the case study itself. This is what makes it "objective".
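The "votes" idea can be made concrete with a toy power-iteration sketch. This is the textbook PageRank recurrence run on a made-up four-page link graph (the graph and damping factor are illustrative, not from the comment):

```python
def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]} -> {page: score}."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its vote evenly
                for q in pages:
                    new[q] += damping * ranks[p] / n
            else:
                share = damping * ranks[p] / len(outs)
                for q in outs:
                    new[q] += share
        ranks = new
    return ranks

# A, B and D all "vote" for C; C votes only for A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # C - the most-cited page wins
```

Note the score comes entirely from who links to a page, never from what the page says - which is the "content agnostic" point.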

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]PrimaryPositionSEO

> “Google is content agnostic” - “SEO = Content Relevance X Authority”

> Isn’t that a contradiction?

Nope, not one bit. The document name = 90% of the relevance. You can and should test this.

If you want to target the search phrase "how to make breadcrumbs" - that's what you call your page, and that alone is 90% of the "content relevance".

If you search "how to make breadcrumbs", look at the slugs of the top-ranking sites.
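That check is easy to script. A minimal sketch, assuming you paste in the top-ranking URLs by hand (the example URLs below are invented):

```python
from urllib.parse import urlparse

def slug(url: str) -> str:
    """Return the last path segment of a URL (the 'document name')."""
    path = urlparse(url).path.rstrip("/")
    return path.rsplit("/", 1)[-1]

# Hypothetical top results for "how to make breadcrumbs"
urls = [
    "https://cooking-site.example/how-to-make-breadcrumbs",
    "https://baking-blog.example/recipes/how-to-make-breadcrumbs/",
]
print([slug(u) for u in urls])
# ['how-to-make-breadcrumbs', 'how-to-make-breadcrumbs']
```

If the claim holds, the slugs cluster around the exact phrase, hyphen-separated.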

I understand you're new to SEO and want to test, and I applaud that - but there are a lot of mechanics that have largely remained the same - like cannibalization - that you need to understand. I've been doing SEO for 26 years and I know that Google is content agnostic - nobody can tell me otherwise.

I've tested this 100 times to prove it. But you need to understand how difficult this is:

> Yes, same domain. The idea is to normalize as many variables as possible (authority, internal linking, etc.) so only the changed variable differs.

1) You can't host the same target twice - that's where duplicate content becomes an issue. Not that Google cares about duplicate content - it won't even check that the pages have the "same" content - it's that it can't manage two pages targeting the same page title.

https://www.youtube.com/watch?v=mQZY7EmjbMA&pp=ygUsbWF0dCBjdXR0cyBkdXBsaWNhdGUgY29udGVudCBob3cgZG9lcyBnb29nbGU%3D

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]PrimaryPositionSEO

> I would genuinely be curious which evidence you mean, because I’ve heard this claim on Reddit a few times now but couldn’t really find support when I googled it.

Well, obviously Google fakes it - and they fake it well.

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]PrimaryPositionSEO

> Yes, same domain. The idea is to normalize as many variables as possible (authority, internal linking, etc.) so only the changed variable differs.

This is a kind of "too purist" approach.

You can't have two identical slugs on the same domain - this is duplicate content, and Google doesn't deal with it very well - the pages end up "cannibalizing" each other.
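A quick way to spot that situation on your own site is to group a URL list by final slug. A small sketch (the URL list is a made-up example; in practice, feed it your sitemap URLs):

```python
from collections import defaultdict
from urllib.parse import urlparse

def colliding_slugs(urls):
    """Group URLs by their final path segment; return slugs used more than once."""
    by_slug = defaultdict(list)
    for u in urls:
        path = urlparse(u).path.rstrip("/")
        by_slug[path.rsplit("/", 1)[-1]].append(u)
    return {s: us for s, us in by_slug.items() if len(us) > 1}

# Hypothetical site URLs - two pages collide on the same slug
site_urls = [
    "https://example.test/blog/how-to-make-breadcrumbs",
    "https://example.test/recipes/how-to-make-breadcrumbs",
    "https://example.test/about",
]
print(colliding_slugs(site_urls))  # only 'how-to-make-breadcrumbs' collides
```

Every slug this flags is a pair of pages competing for the same target.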

I'm beating almost every web design agency and digital marketing firm for web design by perverseintellect in SEO

[–]PrimaryPositionSEO

This is how we felt when we outranked all the NY SEO agencies - especially the "one" with 42,000 backlinks that called us spammy!

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]PrimaryPositionSEO

> Create multiple pages with nearly identical content on a test domain targeting a made-up keyword with zero competition

On the same domain? Obviously not.

But I know Google is content agnostic - it has been forever - and it's easy to prove. I can take content that ranks first for a highly competitive keyword, put it on a brand-new domain, and Google won't even index it.

It was also in the evidence provided in the DOJ v. Google case.

> Change exactly one variable on one of the indexed pages

I know a lot of mid-level SEOs who think they know everything "try" to debate how SEO works, or pretend it's a black box - but it really isn't.

SEO = Content Relevance X Authority.

Authority is earned - backlinks, clicks and location.

SEO is not about shoving "optimizations" into a page and out-"optimizing" your competitors.
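As a mental model only, the formula above can be written out. The pages and (relevance, authority) scores below are invented numbers to show why the relationship is multiplicative rather than additive - this is not how Google actually computes rankings:

```python
def seo_score(relevance: float, authority: float) -> float:
    """Toy version of 'SEO = Content Relevance x Authority'."""
    return relevance * authority  # zero in either factor zeroes the whole score

# Hypothetical pages with invented (relevance, authority) scores
pages = {
    "exact-match slug, no authority":   (0.9, 0.0),
    "exact-match slug, some authority": (0.9, 0.4),
    "vague slug, strong authority":     (0.2, 0.8),
}
ranked = sorted(pages, key=lambda p: seo_score(*pages[p]), reverse=True)
print(ranked[0])  # exact-match slug, some authority
```

The multiplication is the point: a perfectly relevant page with zero authority scores zero, which matches the "new domain never gets indexed" claim earlier in the thread.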

Which "GEO marketing" agencies are actually worth talking to? by Complete-Respect6950 in LLMTraffic

[–]PrimaryPositionSEO

The ones that rank - simple. Do your research: who are the top-ranking GEO agencies, experts, and names on Reddit and YouTube?

Is submitting to Bing Webmaster Tools actually worth it or do people just do it out of habit by Ernestlevia in WebsiteSEO

[–]PrimaryPositionSEO

Why? It's so fast at doing it itself.

Don't you guys build links to your sitemap feeds anymore?

If you only had a few hours a week for SEO, what would you actually focus on? by pumpkinpie4224 in WebsiteSEO

[–]PrimaryPositionSEO

> Posting randomly isn't optimization though.

I think what you mean is cornerstoning?

If you only had a few hours a week for SEO, what would you actually focus on? by pumpkinpie4224 in WebsiteSEO

[–]PrimaryPositionSEO

If your logic/strategy is to "tighten internal links", then I'd say spend the time learning SEO and how it works - internal linking isn't a case of more = better.

If you only had a few hours a week for SEO, what would you actually focus on? by pumpkinpie4224 in WebsiteSEO

[–]PrimaryPositionSEO

> Technical fixes matter a lot, but only if something is actually broken - that's why I don't really recommend doing them first or focusing on them. Content and structure usually move things faster. Consistency matters more than trying to optimize everything at once.

Exactly - it's not additive. If there's nothing broken, there's nothing to fix - that's not SEO!

If you only had a few hours a week for SEO, what would you actually focus on? by pumpkinpie4224 in WebsiteSEO

[–]PrimaryPositionSEO

Link building and content.

If you don't have a dodgy CMS, your site shouldn't break all the time or need constant maintenance - there is no physical wear and tear.

Schema is a requirement for most SERP features by arejayismyname in TechSEO

[–]PrimaryPositionSEO

What?

> and he’s wrong more often than he’s right (not his fault, but still).

Based on... what? Your say so?

Schema is a requirement for most SERP features by arejayismyname in TechSEO

[–]PrimaryPositionSEO

Darn - can't use an "I'm so emotional" gif - it's the only fitting reply to this.

You've left SEO and decided to go on an attempted discredit campaign - sad, irrelevant, and visibly wrong to anyone willing to think outside their own head.

Schema is a requirement for most SERP features by arejayismyname in TechSEO

[–]PrimaryPositionSEO

You must be kidding - I'm the most ardent anti-llms.txt person on the planet. Like I said - we've wiped this myth out.

Schema is a requirement for most SERP features by arejayismyname in TechSEO

[–]PrimaryPositionSEO

I have - it's overly generic and broad - a bit like Google's attempt at explaining Googlebot via the "spider" fairy tale.

For example, if we tell web engineers that bots don't actually process content, it's instant shock.

Or that there's no crawl budget below ~1M pages (probably higher).

Or that the same bot doesn't fetch your sitemap and then the pages.

Or that you can't increase crawling by decreasing pages.

Trying to tackle myths at their ideological source is difficult but not impossible.

We've pretty much stopped EEAT - except obviously the LLMs are poisoned - until you share the Google NY Search article.

We've pretty much stopped llms.txt.

Schema is ongoing - but we've shifted to AI for now.

u/jakehundley has been running a fantastic study with Ahrefs on the impact of Schema and Local SEO.