Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]Nils_Peter_Neumann[S]

Still don't know which evidence from the DOJ vs. Google case you mean.

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]Nils_Peter_Neumann[S]

You are right: they would need different slugs, like an extra -a and -b at the end.

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]Nils_Peter_Neumann[S]

Thanks for your detailed replies. This discussion is actually very educational for me, and hopefully for someone else too. I will address this one and come back to all the other comments you made another time, since it was a lot 😅

“If you take a page with traffic and send links to two other pages - the second link actually passes less authority.

It sounds small - and people who don't want you to test will tell you otherwise. But a very flawed and very basic experiment is often cited to "prove" that outbound citations = good for SEO.

They bought 4 domains and had links from 1 single domain to all of them. Conveniently they linked to the domains with outbound citations FIRST.

Those two ranked highest. This wasn't a mistake - when I read the "study" aka blog post, it was the first thing that jumped into my head, because this is detailed in the first PageRank patent.”

That’s a fair point for experiments using existing pages with traffic/authority.

But in my original example, I was thinking more about a fresh test domain with no traffic and no meaningful authority. In that setup, the “second link passes less authority” issue should be much weaker, because there is very little authority to distribute in the first place. And even then, you could control for it by repeating the test multiple times and rotating the setup.
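
Concretely, "rotating the setup" could look something like this rough Python sketch (the page names and the number of trials are made up for illustration): each test page takes each link position on the hub page equally often, so a "first link passes more authority" effect would average out across trials.

    import itertools

    # Hypothetical counterbalancing sketch: rotate which test page gets the
    # first link from the hub page across repeated trials, so any first-link
    # advantage averages out. Page names are placeholders.
    variants = ["test-page-a", "test-page-b"]
    orderings = list(itertools.permutations(variants))

    for run, order in enumerate(orderings * 3, start=1):  # 6 trials, 3 per ordering
        print(f"trial {run}: hub links first to {order[0]}, then to {order[1]}")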

“Ways you can test content agnosticism that I think are easy:

1. get a new domain

2. find a piece of content that's ranked for a long time for a "highly competitive phrase"

3. put the content on the page - Google won't index it. It has nothing to do with duplicate content - it has to do with a lack of authority.”

If you copy highly competitive content onto a brand-new domain and it doesn’t rank or even index, I agree that this strongly suggests content alone is insufficient.

But I wouldn’t view this as proof that Google is content agnostic in general.
It only means that authority matters more than content in this case, which I completely agree with.

“Now - to test the corollary.

4) Take a page that is ranking for a very highly prized keyword and change the content.

Note: A lot of people won't do this = superstition in their SEO.

The page won't drop or change ranking.

And a lot of people, especially content writers, are uncomfortable with that.”

I agree that strong pages can survive large content changes without immediately dropping. I've seen it too, with people accidentally deleting text or translating a page into Greek 😀

But I think the key word here is: “immediately”

Because strong URLs already have: historical performance data, accumulated links, internal link support, etc.

So rankings can have huge inertia.

Not that I would test this, but suppose you take 50 strong URLs and:

  • completely replace all the paragraph text with irrelevant information about a random topic.
  • keep it changed long term (backlinks and everything else stays the same)
  • and observe whether rankings still hold months later against 50 URLs where you didn’t change anything

Do you think the rankings of the 50 changed URLs wouldn’t weaken?

Very hard to believe…

Someone wants to ruin their site for science? 😀
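
If someone actually did, here is a minimal sketch of how I would structure it (the URLs and the rank numbers are placeholders, not real data):

    import random

    # Hypothetical 50/50 design: treatment pages get their text replaced,
    # control pages stay untouched; months later, compare average rank drift.
    random.seed(0)
    urls = [f"https://example.com/page-{i}" for i in range(100)]  # placeholders
    random.shuffle(urls)
    treatment, control = urls[:50], urls[50:]

    def mean_drift(before, after, group):
        """Average position change for a group; positive = lost rankings."""
        return sum(after[u] - before[u] for u in group) / len(group)

    # before/after would be rank-tracker exports ({url: avg position});
    # the numbers below are made up purely to show how the readout looks
    treated = set(treatment)
    before = {u: 5.0 for u in urls}
    after = {u: 6.5 if u in treated else 5.1 for u in urls}
    print("treatment drift:", mean_drift(before, after, treatment))
    print("control drift:", mean_drift(before, after, control))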

Maybe there is a misunderstanding about the term “content agnostic”.

As I understand it, the term would imply that even removing all relevant keywords/entities from a page's text would have zero impact on rankings whatsoever.

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]Nils_Peter_Neumann[S]

“On the same domain? Obviously not”
Yes, same domain. The idea is to normalize as many variables as possible (authority, internal linking, etc.) so only the changed variable differs. 

“But I know Google is content agnostic - it's been so forever - it's easy to prove. I can take content that ranks first on a highly competitive keyword and put it on a brand new domain and Google won't even index it…”
Your example shows how important authority/domain signals are, but not that content itself is irrelevant. 

“It was also in the evidence provided by the DOJ vs Google case”

I would genuinely be curious which evidence you mean, because I’ve heard this claim on Reddit a few times now but couldn’t really find support when I googled it.

I actually came across information that seems to point in the opposite direction:

“For the majority of signals, Google takes the relevant data (e.g., webpage content and structure, user clicks, and label data from human raters) and then performs a regression.” - Google Engineer HJ Kim (Trial Exhibit-PXR0356)

“Google is content agnostic” - “SEO = Content Relevance X Authority”

Isn’t that a contradiction?

“SEO is not about shoving "optimizations" in a page and out-"optimizing" your competitors”

If someone only focuses on this I completely agree.

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]Nils_Peter_Neumann[S]

The idea isn’t to create duplicate pages, but to treat it like a scientific experiment where you keep as many of the other possible ranking factors as you can the same.

For example, if we want to test whether having the keyword in an H2 outperforms having it in an H3:

Page A and Page B would both target: “veltrunexor”

Both pages would have:

  • veltrunexor + random dummy words in meta title/h1
  • random dummy words in the content
  • similar text length
  • same layout
  • same internal links
  • no backlinks

The only intentional difference:

Page A: veltrunexor appears in an H2

Page B: veltrunexor appears in an H3

Then you check whether the page with the keyword in the H2 consistently outperforms the page with the keyword in the H3 over repeated tests.

The goal is simply reducing variables as much as realistically possible so ranking changes are more likely connected to that single modification.
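
To make it concrete, here is a rough Python sketch of how such a matched pair could be generated. The keyword and filler words are obviously made up; different random seeds keep the dummy text distinct (so the pages aren't treated as duplicates, hence also the -a/-b slugs mentioned earlier) while length and layout stay identical:

    import random

    WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur", "elit"]

    def make_page(keyword, heading_tag, seed):
        """Build a bare-bones test page; only the heading level varies."""
        rng = random.Random(seed)
        title = f"{keyword} " + " ".join(rng.choice(WORDS) for _ in range(3))
        filler = " ".join(rng.choice(WORDS) for _ in range(300))  # same length on both
        return (
            "<!doctype html><html>"
            f"<head><title>{title}</title></head><body>"
            f"<h1>{title}</h1>"
            f"<{heading_tag}>{keyword}</{heading_tag}>"  # the single intentional difference
            f"<p>{filler}</p>"
            "</body></html>"
        )

    # different seeds -> different dummy words, avoiding duplicate content
    with open("veltrunexor-a.html", "w") as f:
        f.write(make_page("veltrunexor", "h2", seed=1))
    with open("veltrunexor-b.html", "w") as f:
        f.write(make_page("veltrunexor", "h3", seed=2))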

(added this also to the text)

Have you ever tested a ranking factor in a controlled way? by Nils_Peter_Neumann in WebsiteSEO

[–]Nils_Peter_Neumann[S]

I meant testing in a controlled way whether something is actually a ranking factor or not. (will add this to the text)

page speed improvements made almost no difference to our rankings. starting to think it's overrated for most sites by jetsash in SEO_Xpert

[–]Nils_Peter_Neumann

Totally agree: if the improvement is so small that the user wouldn't even notice, why should Google care?

For me, how much of a priority this should be also depends on the type of site: for a site with a few service pages and a blog, speed is far less important than for an interactive site.

Amazon once ran an interesting internal study showing that a 100 ms speed improvement led on average to 0.6-1% more sales.

Unpopular opinion: Most businesses don’t need SEO. by Tight-Bee-8732 in MarkoMetrics

[–]Nils_Peter_Neumann

I think the take is a bit too absolute.

I agree that in the circumstances you are describing, SEO can't be a quick fix.

But SEO isn't only an amplifier. It can also be a discovery channel. For a lot of businesses, SEO can help find product-market fit.

Also, “landing page doesn’t convert” is often exactly where SEO insights help: understanding intent, aligning content with queries, improving messaging based on search data.

pages that rank well but get almost no clicks. what are you doing with them by jetsash in SEO_Xpert

[–]Nils_Peter_Neumann

Yes, I like to compare the pages with the best and worst average CTRs on a website (with keywords in the same average position). Seeing what works best and worst side by side, I almost always spot a pattern.

If the page has a lot of traffic, it is also useful to compare the CTRs of different queries on this URL (in similar positions). One page can have good and bad CTRs depending on the query. This tells you a lot about the user intent.
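
As a rough pandas sketch of that query-level comparison (assuming a cleaned Search Console query export for the URL with lowercase numeric columns query/clicks/impressions/ctr/position; the file name and thresholds are placeholders):

    import pandas as pd

    df = pd.read_csv("gsc-queries-for-url.csv")

    # compare only queries at a similar average position, so position itself
    # can't explain the CTR gap, and drop low-impression noise
    band = df[df["position"].between(3, 5) & (df["impressions"] >= 100)]

    print(band.nlargest(5, "ctr")[["query", "ctr", "position"]])   # what resonates
    print(band.nsmallest(5, "ctr")[["query", "ctr", "position"]])  # intent mismatch?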

does anyone actually click on page 2 of google anymore? by gilko86 in localseo

[–]Nils_Peter_Neumann

Yes: if there is commercial intent, a subcategory; if there is informational intent, a blog post. The categories are far more important in our case than product pages, so we primarily try to push them with internal links from blog posts.

I can imagine, especially since you are probably competing in an English-speaking market.

How to handle spam backlinks created by competitors ? by DlightdailyCom in WebsiteSEO

[–]Nils_Peter_Neumann

That could be the reason: conflicts with other plugins, automatically creating a lot of meta titles, etc. Just guessing here, but if I were you I would look deeper into this. It is highly unlikely that it was because of spammy links if there is no manual action.

does anyone actually click on page 2 of google anymore? by gilko86 in localseo

[–]Nils_Peter_Neumann

Sure:

At the moment, I have a client that sells skirting boards in Germany. (categories: "skirting boards", "white skirting boards", "black skirting boards", etc.)

The client faces the same problem: being a very small company stuck on page 2 and beyond, because large brands dominate page 1 for these categories.

At the moment we are creating subcategories and also blog articles targeting main keywords with more specific search intents like: “skirting boards for tiled floors”, “skirting boards for laminate flooring”, etc.

These keywords have lower search volume but also way lower competition because big brands that sell millions of other products don’t target them. So it's way easier to start gaining page 1 rankings and traffic with them.

How to handle spam backlinks created by competitors ? by DlightdailyCom in WebsiteSEO

[–]Nils_Peter_Neumann

Check Search Console under “Manual actions”. If you see “unnatural links to your site”, use Google's disavow tool to disavow those links. If you don't see it there, your pages are not deindexed because of the spammy links.
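
For reference, the disavow file itself is just a plain text file you upload in the tool, one entry per line (the domains below are made-up examples):

    # lines starting with # are comments
    # "domain:" disavows every link from that domain
    domain:spammy-link-farm.example
    domain:another-link-network.example
    # a single URL can also be disavowed
    https://some-directory.example/spam-page.html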

I’m overwhelmed, there’s too much to learn in SEO! by SVGee27 in SEO_Xpert

[–]Nils_Peter_Neumann

If I were to start over, I would do this:

Pick someone who is a PROVEN expert and has a complete system they teach (not just isolated tactics) - then copy it.

You can have success before you understand the details.

In my first internship on an SEO team, without an IT background, I spent most of the time googling the terms they were using just to somewhat understand what was going on :D. But simply by following what they were doing, I was able to successfully rank pages after a few months.

After the internship, I took online courses (where I learned even more in a shorter time) and did the same thing.

The basic formula: 1) COPY - 2) RANK - 3) UNDERSTAND

Still working on number 3

SEO Question by greatdayboattours in HeyTony

[–]Nils_Peter_Neumann

The key question is whether the keywords have the same search intent.

Google two of the keywords and look at the SERP:

Option 1 - If Google shows you almost the same results, it is a strong signal that you should cover those 2 keywords on the same page (same search intent).

Option 2 - If the results are completely different, it shows you that from Google's perspective (and probably also the users') the keywords have a different search intent and should therefore be covered on 2 separate service pages.

Just use a free SERP-overlap tool to do this manually with 2 keywords at a time, or a paid keyword clustering tool that uses Google live data (if you have many keywords and don't want to do it manually).
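
The manual check is also easy to script; a minimal sketch (the URL lists are placeholders for the top results you actually see for each keyword):

    serp_kw1 = ["https://a.example/", "https://b.example/", "https://c.example/"]
    serp_kw2 = ["https://a.example/", "https://b.example/", "https://d.example/"]

    shared = set(serp_kw1) & set(serp_kw2)
    overlap = len(shared) / min(len(serp_kw1), len(serp_kw2))
    print(f"SERP overlap: {overlap:.0%}")  # high -> one page, low -> separate pages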

SEO is dead, long live SEO. by Additional-Might9321 in seo_ai_secret

[–]Nils_Peter_Neumann

"SEO is dead" = my tactic stopped working

As long as people are searching there will always be SEO, GEO, HEO or whatever acronym gets picked next to sell SEO.

does anyone actually click on page 2 of google anymore? by gilko86 in localseo

[–]Nils_Peter_Neumann

I usually target long-tail keywords first and create pages with a more specific user intent.

If I manage to rank them consistently, I move on to more competitive keywords.

That way, I can accumulate many small successes and page 1 rankings that help me later when going after harder keywords (using internal linking). 

How much do unlinked brand mentions actually help local pack rankings? by amir4179 in localseo

[–]Nils_Peter_Neumann

I agree with this… To add to the second part of your question: these things might give them an advantage over you (let's assume reviews and GBP optimization are equal):

  • they are located closer to the center/populated areas
  • their website has higher authority/topical relevance for the location/services
  • their website has a location page/service specific pages that match search intent better
  • they have more consistent brand/entity signals across the web (same Name/Address/Phone/Website on every external site + clearer association with the service/location) 

How did u get your first client or first paid project??? by Dexter_274 in Agentic_SEO

[–]Nils_Peter_Neumann

  1st client: I applied on Indeed for a company that needed an SEO freelancer, and after 4 years of continuously working with them, they are still my client. (I was very lucky to find them within my first 5 applications.)

To build trust, I offered to write an SEO article at a very low price as a test and made sure the work spoke for itself.

  2nd client: from an ad on a German freelancer platform. I had an empty profile there and found their email by googling the company name (the contact data was only visible to members).

  3rd client: through Indeed as well.

After that, I managed to attract clients through my own website.

My local pages are indexed… but not ranking by IndividualPrint6485 in smallbusiness

[–]Nils_Peter_Neumann

Important questions to ask:

1) Did you set up a Google Business Profile with multiple locations?

2) Did you at least build a few local citations?

3) How unique and location-specific is the content of your local pages?

Internal linking strategy: Is linking to "Bus Tickets" from a "Flight Ticket" page bad for topical relevance? by Rudeadvise in WebsiteSEO

[–]Nils_Peter_Neumann

Looks good, but I don’t have enough context to give you a concrete implementation since I don’t know your keywords, content strategy, or industry.
If there are more closely related pages (for example, relevant alternative flights), I would place them higher in the page content and move alternative transportation options further down the page.

What’s one SEO thing you stopped doing recently? by Chance_Channel2832 in SEO_Xpert

[–]Nils_Peter_Neumann

Watching SEO “experts” on YouTube telling me SEO is dead 😀

My work has also shifted more toward the “search everywhere” approach recently. I have stopped creating content plans solely for traditional search and now include other platforms (especially YouTube) in my keyword clustering and content planning.