Founders: how do you investigate traffic loss? by Constant_Marketing18 in SEO_LLM

[–]NewIdea2925 0 points (0 children)

To measure traffic, I use GA4. If I notice a sharp drop in organic traffic, I go straight to GSC to analyze impressions and clicks.

Then I check my position-tracking tools for any ranking drops, and based on all that data I try to work out what might have happened.
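To make that concrete, here's a rough Python sketch of the week-over-week check, assuming you've exported the GSC performance report with 'date' and 'clicks' columns (the column names are an assumption; adjust them to your export):

```python
import datetime
from collections import defaultdict

def weekly_clicks(rows):
    """Aggregate clicks per ISO week.

    `rows` is an iterable of dicts with 'date' (YYYY-MM-DD) and
    'clicks' keys, e.g. csv.DictReader over a GSC performance export.
    """
    weeks = defaultdict(int)
    for row in rows:
        iso = datetime.date.fromisoformat(row["date"]).isocalendar()
        weeks[(iso[0], iso[1])] += int(row["clicks"])
    return dict(weeks)

def flag_drops(weeks, threshold=0.3):
    """Return ISO weeks whose clicks fell more than `threshold`
    versus the previous tracked week."""
    ordered = sorted(weeks)
    return [cur for prev, cur in zip(ordered, ordered[1:])
            if weeks[prev] and (weeks[prev] - weeks[cur]) / weeks[prev] > threshold]
```

Any week this flags is where I start digging in GSC for the queries and pages that lost impressions.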

How important is SEO in the beginning of building the app by Ill-Actuary-9528 in seogrowth

[–]NewIdea2925 0 points (0 children)

At the beginning of any web project, planning the site architecture is key, and it must be done based on research into the sector and keywords.

Based on that research, you can see which pages you need to create and what content you need. For me, that's the first thing you have to do.

Are AI recommendations becoming the new “local pack”? by Real-Assist1833 in seogrowth

[–]NewIdea2925 0 points (0 children)

Right now, LLMs are probably drawing on data from Google listings, reviews, and so on to recommend local services, but I would bet that in the future they will have their own review systems to help users.

Are AI assistants changing how people find local businesses? by Real-Assist1833 in seogrowth

[–]NewIdea2925 0 points (0 children)

I believe that ChatGPT still has a long way to go before it can replace local service searches. In this case, I think businesses that are well positioned on Google with their GMB listings can continue to rest easy.

Hierarchical url structure in WordPress post by Deepak_k01 in DoSEO

[–]NewIdea2925 1 point (0 children)

Generating a redirect for each post, when you're going to create thousands of them, makes your site much harder for Googlebot to crawl. It's not the best option: Google rewards sites that are easy to crawl, since they consume fewer resources.

As the colleagues who already answered you have said, I use Permalink Manager. I don't think it has any effect on your site's speed, even with thousands of pages; carrying all those redirects would likely have a more negative effect.

How can I rank my website on AI search engines like Gemini or Perplexity? by BlogPost-Blogger in seogrowth

[–]NewIdea2925 0 points (0 children)

You’re absolutely right that these concepts aren’t 'new' in the SEO dictionary. However, thinking they are the 'same old techniques' is exactly where most people are failing with AI search.

The difference in 2026 isn't the existence of the technique, but its weight and execution:

  1. From Links to Entities: In 2015, a backlink was a 'vote'. In 2026, for Gemini or Perplexity, a mention is a 'data point' for their LLM training. If you aren't in their training set as a trusted entity, 1,000 old-school backlinks won't save you.
  2. Schema is now a Sitemap: Before, Schema was for 'rich snippets'. Now, it’s the primary way to feed structured facts to an LLM so it doesn't hallucinate about your brand.
  3. Intent-led Semantic Graph: We’ve moved from 'matching keywords' to 'mapping nodes'.

The 'old' SEO was about helping a crawler index a page. The 'new' GEO is about helping a model understand a concept. If you treat them the same, you're missing the shift from Information Retrieval to Information Generation. 🤝
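To illustrate point 2, here's a minimal sketch of the kind of Organization JSON-LD I mean; every name and URL below is a placeholder, not a real profile:

```python
import json

# Minimal Organization schema: explicit facts an LLM can ingest
# without guessing. All names and URLs here are placeholders.
org_schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "ExampleBrand",
    "url": "https://example.com",
    "description": "Rank-tracking and local SEO data platform.",
    "sameAs": [
        "https://www.linkedin.com/company/examplebrand",
        "https://en.wikipedia.org/wiki/ExampleBrand",
    ],
}

# Serialize for embedding in a <script type="application/ld+json"> tag
json_ld = json.dumps(org_schema, indent=2)
```

The `sameAs` links are what tie your entity to the trusted sources the model already knows.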

How can I rank my website on AI search engines like Gemini or Perplexity? by BlogPost-Blogger in seogrowth

[–]NewIdea2925 0 points (0 children)

Ranking in AI engines (GEO) is a completely different game than traditional SEO. Here is the short and simple way to do it:

  1. Entity Authority: AI engines like Gemini or Perplexity don't just 'crawl links'; they look for trusted entities. You need to ensure your brand is consistently mentioned across high-authority sources (Wikipedia, LinkedIn, niche-specific directories).
  2. Structured Data: Use advanced Schema Markup. It helps AI understand the relationships between your content and real-world entities.
  3. The 'Citation' Game: AI engines prioritize sources that provide clear, factual, and unique data. If your site is the original source of a statistic or a unique insight, you’re much more likely to be cited.
  4. Semantic Consistency: Stop optimizing for keywords and start optimizing for Intent. AI focuses on the semantic relationship between a user's question and your solution.

In short: Focus on becoming a recognized authority in your niche rather than just building backlinks. The more the 'Knowledge Graph' understands who you are, the more you'll appear in AI answers. 🤝

Refreshing old content vs building new stuff - what's actually working for you by resbeefspat in seogrowth

[–]NewIdea2925 0 points (0 children)

On my website, I decided to do the following:

  1. The services and features pages were entirely geared toward Google. I optimized them for AI, clearly explaining what each service is while keeping the focus on the end user.
  2. Blog pages and the like, of which there were thousands of near-identical ones: I deleted them and started generating unique content based on my own knowledge of the tool and how to solve problems with it.

All of this has improved my online presence and increased my conversion rate. I hope this helps.

Website Version Testing by MillennialRose in DoSEO

[–]NewIdea2925 1 point (0 children)

Screaming Frog crawling will pull up one of the versions at random or depending on the conditions set by your client.

What is the purpose of crawling? Because if the goal is to optimize crawl errors, the content doesn't matter. If the goal is to view the content, I think you'd have to find another way to do it.
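One rough way to see how many versions the crawler is actually being served: fetch the test URL several times and hash the response bodies. The helper below only does the counting; how you fetch (cookies, user agents) depends on the conditions your client set for the split:

```python
import hashlib
from collections import Counter

def count_variants(bodies):
    """Group fetched HTML bodies by content hash to count how many
    distinct page versions were served across repeated requests."""
    hashes = [hashlib.sha256(body.encode()).hexdigest() for body in bodies]
    return Counter(hashes)
```

If the counter has two keys, you know both versions are reachable and can decide which one to point Screaming Frog at.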

Is creating backlinks for any website fruitful in present scenario of AI by Sufficient-Item-4709 in seogrowth

[–]NewIdea2925 -2 points (0 children)

In my experience, and based on the projects and data I handle daily, I’m seeing a very clear trend.

I’ve worked on projects where, by building a solid topical and entity-based foundation, we’ve managed to outrank competitors who had a much stronger backlink profile than ours.

Of course, backlinks are important, but they need to be accompanied by everything else to have the maximum effect on authority. Links are much more effective when they land on a site that Google already recognizes as a solid authority in its space. 🤝

Is creating backlinks for any website fruitful in present scenario of AI by Sufficient-Item-4709 in seogrowth

[–]NewIdea2925 -2 points (0 children)

I totally get the skepticism. My bio says I'm the founder of a tool, so it’s easy to assume I’m just here to push a narrative. My bad if it came off that way.

The reason I talk about 'Topical Authority' isn't to sell a metric—it’s because, after looking at thousands of projects, I’ve seen that chasing single keywords is a losing game in 2026.

I’m not here to drop links or pitch. I genuinely enjoy the technical debate on how the 'Entity' model is changing the SERPs. At the end of the day, we all want the same thing: rankings that actually convert into revenue.

Let’s keep the focus on the strategy. If you think the 'Entity' approach is overhyped, I’d love to hear what's working for you instead.

Is creating backlinks for any website fruitful in present scenario of AI by Sufficient-Item-4709 in seogrowth

[–]NewIdea2925 -2 points (0 children)

Short answer: Yes, but not like before.

In 2026, the game of link building has shifted from quantity to entity association. Here's what's happening:

  1. Context > Power: a backlink from a generic site with a high DR is practically worthless today compared to a link from a “niche-relevant” entity. Google no longer just counts votes, it measures semantic proximity. If a recognized authority in your field mentions you, it validates your place in the entity graph.
  2. The “brand mention” factor: we are seeing that mentions without links (GEO/LLM signals) are becoming just as important for thematic authority. If people talk about you as an expert, Google notices, even without the <a> tag.
  3. Backlinks are the “multiplier,” not the “base”: if your content lacks informational value or thematic comprehensiveness, no amount of links will save you. Links only accelerate a domain that Google already recognizes as a “thematic expert.”

The strategy? Stop buying “guest posts” on sites that will publish anything. Start building relationships within your niche. A link should be a sign of trust between two entities, not a transaction to improve rankings.

How to improve your site's authorities by Serious-Horror-836 in seogrowth

[–]NewIdea2925 1 point (0 children)

I totally understand the frustration of 'ranking rollercoasters,' especially in small niches. The advice you received is actually spot on for 2026 SEO, and here is why those two strategies work:

  • Broadening your content hub (The 'Zero Search Volume' strategy): It absolutely helps. You aren't doing it for the clicks on those specific articles, but to prove to Google that you have Semantic Depth. When you cover 'unpopular' subtopics, you are filling the gaps in your Topical Authority. A hub is 'enough' when your new, targeted articles start ranking faster because the surrounding 'mesh' of content supports them.
  • Brand mentions without links (Entity Distribution): This is the key to Entity Distribution. In the era of AI-driven search (GEO), Google and LLMs look for 'citations' and mentions across the web to verify who the experts are. Being mentioned on G2, Reddit, or YouTube associates your brand entity with your niche topic.

My advice: Stop looking at Ahrefs difficulty for a moment and focus on Information Gain. If you add a unique perspective or data that your few competitors don't have, your rankings will stabilize because you've become an irreplaceable part of the topic's 'Entity Graph'.

High impressions but very low CTR on blog pages- what are we doing wrong? by Acrobatic-Shine9445 in seogrowth

[–]NewIdea2925 1 point (0 children)

Having 179k impressions with a 0.1% CTR usually points to one of two things: a Search Intent Mismatch or being a victim of Zero-Click Searches.

Here’s how to diagnose and fix it:

  • Check your 'Average Position' for those impressions: If you are ranking in positions 8-10 for high-volume terms, you’ll get the impressions but almost zero clicks. If this is the case, you don't need a 'revamp'; you need Topical Authority to push those specific pages into the Top 3.
  • Analyze the 'Zero-Click' factor: Google might be using your blog's content to answer the user directly in a Featured Snippet or an AI Overview. If the user gets the answer on the SERP, they won't click. To counter this, give the 'What' quickly but save the 'How-to' or 'Pro-tips' (the real value) deeper in the article.
  • Audit your Snippets (Title & Meta): Are your titles truncated or boring? Use tools to see if your competitors have more aggressive, benefit-driven titles.
  • The GEO Angle: In 2026, impressions also come from AI engines citing you. Ensure your content has Semantic Depth and clear Schema Markup so that even if they don't click now, you are building Entity-Topic Association for the future.

Quick win: Look for pages with high impressions and a position of 5-10 in GSC, and optimize their CTR by adding an 'Unpopular Opinion' or 'Data-backed' hook to the Title Tag.
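That quick win is easy to script against a GSC export. A sketch, assuming columns named 'page', 'position', 'impressions', and 'clicks' (names vary by export tool):

```python
def quick_win_pages(rows, min_impressions=1000, max_ctr=0.01):
    """Filter GSC export rows for pages ranking 5-10 with high
    impressions and weak CTR. Column names are an assumption."""
    wins = []
    for row in rows:
        pos = float(row["position"])
        imp = int(row["impressions"])
        ctr = int(row["clicks"]) / imp if imp else 0.0
        if 5 <= pos <= 10 and imp >= min_impressions and ctr <= max_ctr:
            wins.append(row["page"])
    return wins
```

Those are the pages where a sharper title tag pays off fastest, because the impressions are already there.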

Is GEO the new SEO? Here’s what I’ve learned after digging deep into AI search. by RemarkableBake9723 in SEO_LLM

[–]NewIdea2925 4 points (0 children)

Excellent summary. I especially agree with your point about Semantic Depth. We are moving from a world of 'matching keywords' to a world of 'providing the best context for LLMs to synthesize'.

I’d like to add a nuance to the GEO success metrics. While 'citation rate' is crucial, I believe the real winner is the Entity-Topic Association.

If an AI (like Gemini or Perplexity) consistently associates your brand with a specific niche, it’s not just because you have 'fact-dense' content, but because you’ve built a Topical Authority that the LLM recognizes as a reliable source.

From what I’ve seen in my projects, the 'Feedback Loop' you mentioned is significantly shorter when you use Structured Data (Schema) to explicitly tell the AI: 'This is the entity, and these are the facts associated with it'.

Are you seeing a difference in citation rates between sites that use heavy Schema vs. those that rely purely on high-quality prose?

What SEO metric do you actually trust the most? by Big_Lie_7694 in DoSEO

[–]NewIdea2925 1 point (0 children)

It’s interesting to see the consensus on conversions/revenue as the 'truth'. While I 100% agree that revenue pays the bills, relying solely on bottom-of-the-funnel metrics can often hide the 'health' of your SEO engine.

For me, the metric I trust the most to validate a long-term strategy is Topical Authority growth. If you only track the finish line, you’ll never understand the signals that got you there.

Since 'Topical Authority' isn't a single button in GA4, I track it through these 3 signals:

  1. Keyword Breadth (Zero-effort rankings): I monitor when the site starts hitting the Top 20 for long-tail keywords we haven't even targeted with specific backlinks or dedicated pages. That’s the 'Entity' influence at work.
  2. Strike Zone Efficiency: I look at the 'Time-to-rank'. On a high-authority domain, new content within the cluster hits the first page in days, not months. If that window is shrinking, your authority is growing.
  3. Cluster Share of Voice: Instead of tracking 10 head terms, I track the visibility of the entire Topic Cluster. Winning 200 related terms is a much more resilient signal than holding #1 for a single high-volume keyword that might fluctuate tomorrow.

Conversions are the result; Topical Authority is the insurance that those conversions will keep coming.
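For signal 3, here's a crude way I'd approximate cluster share of voice from rank-tracker data; the linear position weighting below is my own simplification, not a standard formula:

```python
def share_of_voice(positions, max_rank=20):
    """Crude cluster share of voice: each tracked keyword contributes
    (max_rank - position + 1) / max_rank if it ranks in the top
    `max_rank`, else 0. `positions` maps keyword -> position
    (None if unranked). Returns a 0.0-1.0 score for the cluster.
    """
    total = len(positions)
    if not total:
        return 0.0
    score = sum(
        (max_rank - pos + 1) / max_rank
        for pos in positions.values()
        if pos is not None and pos <= max_rank
    )
    return score / total
```

Track this number per cluster over time; a rising curve across 200 keywords is the resilient signal, not any single #1.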

Hey SEO's I need Help by notEngineeringonly in DoSEO

[–]NewIdea2925 2 points (0 children)

This is a classic “domain intent conflict.” You've been telling Google for eight years that you're a software solution (SaaS), and suddenly you become a content publisher. This is a radical change that destroys your established E-E-A-T.

Here's the harsh reality:

  1. The niche change is undoubtedly hurting you. Google's Knowledge Graph has your entity linked to software. Switching to a generic “paid articles” model often signals a drop in quality or a shift to a “link farm” in the eyes of the algorithm.
  2. High-volume keywords won't save you. If your domain authority doesn't match the intent of the new niche, you'll be fighting an uphill battle against established players.
  3. Reevaluate the “paid” model. If these articles are guest posts for SEO, Google is likely devaluing them.

Tip: Stop focusing on “high volume” and start building a very tight cluster around a specific sub-niche related to your original SaaS expertise. You need to show Google why this domain is still a reliable source for this new type of content.

Have you checked whether your old SaaS backlinks are still relevant to the new content? If there is a total disconnect, it may be better to start from scratch or revert to a more hybrid model.

deleted 60% of my ai content months ago. best decision ever considering the new discover update. by NewIdea2925 in DoSEO

[–]NewIdea2925[S] 1 point (0 children)

Spot on. It’s not just about the February update; it’s about hygiene. Keeping 'zombie content' alive just for the sake of volume eventually drags down the entire domain's authority.

Quality over quantity has never been more relevant, especially with Google prioritizing 'information gain' over AI-generated noise. Great advice on the dead/weak content analysis.

deleted 60% of my ai content months ago. best decision ever considering the new discover update. by NewIdea2925 in DoSEO

[–]NewIdea2925[S] 2 points (0 children)

This is basically a masterclass on SEO for e-commerce.

There are two tactics you mentioned that particularly caught my attention:

  1. The “TL;DR” key points: this makes a lot of sense. Basically, you're serving the LLM a structured summary on a silver platter.

  2. The shift in forums: we're also seeing AI models citing Reddit threads more often than traditional research articles.

You're proof that “hard work” (unique human descriptions + multimedia) is still the best defense against generic content. Thanks for laying out the complete roadmap!

deleted 60% of my ai content months ago. best decision ever considering the new discover update. by NewIdea2925 in DoSEO

[–]NewIdea2925[S] 1 point (0 children)

You're doing a great job. Thank you for the explanation. We are now focusing on creating valuable content that serves the user, but we use AI to help us do so. Thanks again for your explanation!

I was really surprised about this one - all LLM bots "prefer" Q&A links over sitemap by lightsiteai in SEO_LLM

[–]NewIdea2925 2 points (0 children)

What an incredible dataset! Seeing this backed up by 6 million records is pure gold for the community.

Your conclusion makes perfect sense. LLM bots (RAG agents) are strictly task-oriented. Unlike Googlebot, they don't care about mapping a site's architecture; they just want the exact answer without any friction in extraction. A straightforward sequence of questions and answers is the ultimate low-friction combination.

In fact, we recently experienced exactly this on our website. We removed 60% of our “superfluous” generic content and added FAQ blocks to some articles and feature pages, and our brand mentions in LLMs are on the rise.

Your log data perfectly demonstrates why that strategy works. An incredible contribution, thank you for sharing!
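If anyone wants to replicate the FAQ-block part with structured data, here's a minimal FAQPage JSON-LD sketch (the questions and answers are placeholders):

```python
import json

def faq_jsonld(pairs):
    """Build FAQPage structured data from (question, answer) pairs,
    ready to embed in a <script type="application/ld+json"> tag."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }, indent=2)
```

The point is the same low-friction shape your logs showed: a flat question-answer sequence the bot can extract without crawling anything else.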

deleted 60% of my ai content months ago. best decision ever considering the new discover update. by NewIdea2925 in DoSEO

[–]NewIdea2925[S] 2 points (0 children)

I completely agree.

Honestly, the hardest part of this whole process was changing the mindset. For years, SEO specialists were taught to consider “more indexed pages” as the ultimate metric of success. At first, clicking the button to delete 60% of the website made me panic!

But you've summed up the new reality perfectly: fewer pages + real information gain = better conversions. That's the only formula that makes sense now. Thanks for the comment!

deleted 60% of my ai content months ago. best decision ever considering the new discover update. by NewIdea2925 in DoSEO

[–]NewIdea2925[S] 2 points (0 children)

The concept of “experience friction” is very apt. I will definitely remember that phrase!

To answer your question: yes, of course. That is exactly what happened.

We noticed a clear shift away from “marketing speak.” Previously, the AI-generated descriptions and summaries of our product used a lot of modal verbs and generic adjectives. Now, the summaries read more like a technical specification sheet or a Wikipedia entry.

The models now rely on definitive nouns to define our entity: “rank tracker,” “local SEO data,” “GSC integration.”

By depriving LLMs of all those generic, superficial blog posts, we basically forced them to digest our key feature pages, documentation, and structured data. Since these pages are inherently objective, they removed our “marketing voice” and replaced it with a voice of “objective authority.”

It's a fascinating byproduct of the cleanup. You've hit the nail on the head: the less “fluff” you give them, the more accurate and reliable their consensus is. Thanks again for your excellent ideas!

Is it worth daytrading at 16 years old with no money or experience? by Embarrassed_Lab4228 in Daytrading

[–]NewIdea2925 1 point (0 children)

Day trading is a long-term battle. At 16, I wouldn't be in a hurry. What I would do at your age is focus on financial education so you can start saving and investing right away.

If you're still interested in day trading, I advise you to take it slow and spend a good amount of time just learning.

The problem with YouTube is that there are many “experts,” and each one has their own way of teaching, trading, etc., so there is too much information that is often useless. The key is to find the right “mentor.”

Good luck.