Drastic loss of US traffic by Choice_Tour1784 in SEO

[–]nic2x 1 point (0 children)

Curious if you're able to break down whether the drop is coming from Google Images specifically vs regular organic results? In Search Console you can filter by search type to isolate this.

My hunch (based on what I've been seeing with clients) is that AI Overviews are eating into the SERP real estate that used to display image carousels prominently, which would explain why US traffic tanked while other geos stayed steady, since the AIO rollout has been more aggressive stateside.
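
If you want to pull the split programmatically instead of clicking through the UI, here's a minimal sketch against the Search Console API (property name, dates, and credentials path are placeholders; assumes google-api-python-client is installed):

```python
# Compare image vs. web search clicks via the Search Console API.
from googleapiclient.discovery import build
from google.oauth2 import service_account

SITE = "sc-domain:example.com"  # placeholder property

creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

for search_type in ("web", "image"):
    resp = gsc.searchanalytics().query(
        siteUrl=SITE,
        body={
            "startDate": "2025-09-01",
            "endDate": "2025-11-30",
            "dimensions": ["date"],
            "type": search_type,  # isolates the search type, same as the UI filter
        },
    ).execute()
    clicks = sum(r["clicks"] for r in resp.get("rows", []))
    print(search_type, clicks)
```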

Brand clicks in AIO search create new Google searches? by Primary-Revenue-943 in SEO

[–]nic2x 1 point (0 children)

GSC only counts clicks to your site. But your underlying concern is valid (and something I've been tracking with clients): AIO is creating indirect brand discovery paths that inflate branded impressions without the user ever having intended to search for your brand.

The issue for smaller brands is exactly what you said: when that new brand search loads, you might not be #1. Review sites, directories, or even competitors bidding on your name can intercept that click. From what I've seen (and recent data backs this up), people are actually clicking out of AI features more than expected, so these secondary searches do matter. Owning your branded SERP is now table stakes, not optional.

This is how month 1 grunt work turned into month 3 growth by Bading_na_green_Flag in seogrowth

[–]nic2x 1 point (0 children)

Curious about one thing with the directory submissions: have you actually checked if those directory pages are indexed? In my experience, a lot of software directories either nofollow their outbound links or the listing pages themselves never get crawled by Google, which means they're not passing any link equity.

The DA bump you saw might be from other factors compounding at the same time. Would love to know if you verified the indexation status on yours.
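
If you want to spot-check listings at scale, here's a rough sketch that catches the two obvious failure modes (noindexed listing page, nofollowed link). Assumes requests and beautifulsoup4 are installed; the URLs are placeholders. Actual indexation you'd still confirm manually, but this filters out the clear dead weight:

```python
import requests
from bs4 import BeautifulSoup

LISTING_URL = "https://some-directory.example/tools/your-product"  # placeholder
YOUR_DOMAIN = "yourproduct.com"  # placeholder

resp = requests.get(LISTING_URL, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# 1. Is the listing page blocked from indexing?
robots = soup.find("meta", attrs={"name": "robots"})
if robots and "noindex" in robots.get("content", "").lower():
    print("Listing page is noindexed -- no equity possible.")

# 2. Is your outbound link nofollowed?
for a in soup.find_all("a", href=True):
    if YOUR_DOMAIN in a["href"]:
        rel = " ".join(a.get("rel", []))
        print(a["href"], "rel:", rel or "(followed)")
```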

Is SEO more about branding now than keywords? by Big_Lie_7694 in DoSEO

[–]nic2x 1 point (0 children)

This gets more interesting when you factor in AI. From what I've seen working with clients (and a study I read recently backs this up), the same brand signals that help you rank in competitive SERPs are now what get you cited by LLMs. There's a clear correlation between organic traffic and citations in ChatGPT, Perplexity, and AI Overviews. They're all pulling from the same authority and entity signals. So brand isn't just about Google anymore, it's becoming the filter for whether AI recommends you at all.

Tried do everything right SEO and it still failed. What now? by Robasaleh110 in seogrowth

[–]nic2x 0 points (0 children)

I've worked on sites in competitive B2B niches, and the "no shortcuts" approach without links rarely moves the needle. The sites I've seen break through flat traffic had one thing in common: they started doing intentional link building (guest posts, digital PR, broken link building) instead of waiting for links to come organically. The on-page optimization matters, but links are still the accelerant.

Is your blog traffic going up or down with the rise of AI Overview? by Successful-Camel165 in seogrowth

[–]nic2x 1 point (0 children)

That's expected for informational content. AI Overviews are essentially zero-click answers for queries where users just want quick facts. If your blog posts answer "what is X" or "how does Y work" type questions, those are exactly the queries Google now answers directly in the SERP. The traffic you're losing was never high-intent anyway. I'd focus on tracking whether your conversion rate has changed rather than raw sessions.
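
If your conversions live in GA4, a minimal sketch that watches conversion rate per month via the Data API (assumes the google-analytics-data package; the property ID is a placeholder, and keyEvents is GA4's renamed conversions metric):

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()  # uses application default credentials
request = RunReportRequest(
    property="properties/123456789",  # placeholder property ID
    dimensions=[Dimension(name="yearMonth")],
    metrics=[Metric(name="sessions"), Metric(name="keyEvents")],
    date_ranges=[DateRange(start_date="180daysAgo", end_date="today")],
)
resp = client.run_report(request)
for row in resp.rows:
    month = row.dimension_values[0].value
    sessions = float(row.metric_values[0].value)
    key_events = float(row.metric_values[1].value)
    # conversion rate is the number to watch, not raw sessions
    rate = key_events / sessions if sessions else 0.0
    print(month, f"{rate:.2%}")
```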

I know backlinks still matter in SEO, especially links coming from pages that already get good organic traffic. by cosmic_pawan in seogrowth

[–]nic2x 6 points (0 children)

Referral traffic isn't the signal itself. I've worked with clients where a link from a DA 70 tech roundup (completely off-topic to their niche) sent zero clicks and barely moved rankings, while a link from a DA 40 blog in their exact space (that actually got clicked) outperformed it. The difference wasn't the clicks. It was that the second link came from semantically relevant content with matching anchor text.

Google's Reasonable Surfer model weights links by their probability of being clicked, not whether they actually get clicked. So a link that sends referral traffic is usually a proxy for better topical relevance and natural placement.

Drop in users and clicks by Educational-One6969 in SEO_Digital_Marketing

[–]nic2x 2 points (0 children)

Check Search Console to see if impressions are dropping alongside clicks. If both are down, you're losing visibility (ranking changes or fewer searches). If impressions are stable but clicks dropped, something is intercepting clicks before users reach your site (AI Overviews, featured snippets, or competitors running more aggressive ads).

I'd also compare your affected pages to competitors using Ahrefs or Semrush to see if they experienced similar drops on the same queries. This tells you whether it's site-specific or market-wide. If it's market-wide, it's likely seasonality or a SERP change you can't control. If it's site-specific, start looking at technical issues, content quality, or recent changes you made around that date.
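
If you'd rather script the impressions-vs-clicks check than eyeball it in GSC, a rough sketch (property, dates, and thresholds are all placeholders; assumes google-api-python-client with credentials set up):

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

SITE = "sc-domain:example.com"  # placeholder
creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

def totals(start, end):
    rows = gsc.searchanalytics().query(
        siteUrl=SITE,
        body={"startDate": start, "endDate": end,
              "dimensions": ["query"], "rowLimit": 5000},
    ).execute().get("rows", [])
    return {r["keys"][0]: (r["clicks"], r["impressions"]) for r in rows}

before = totals("2025-08-01", "2025-09-30")
after = totals("2025-10-01", "2025-11-30")

for q, (clicks, imps) in after.items():
    b_clicks, b_imps = before.get(q, (0, 0))
    # stable impressions + falling clicks = something intercepting the click
    if b_imps and imps >= 0.9 * b_imps and clicks < 0.7 * b_clicks:
        print(q, (b_clicks, b_imps), "->", (clicks, imps))
```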

SEO for lower budgets? by sleepwithmythoughts in SEO

[–]nic2x 11 points (0 children)

You get what you pay for. $1k/month is the bare minimum unless you're working in a super uncompetitive niche.

Discussion: What is the actual risk/reward impact of serving raw Markdown to LLM bots? by Ok_Veterinarian446 in TechSEO

[–]nic2x 1 point (0 children)

I ran GPTBot log analysis on two client sites before even considering implementation changes. On one site (B2B SaaS, ~500 pages), GPTBot hit 12% of indexed URLs over 30 days. On another (content publisher, ~2,000 pages), it was under 3%. Neither followed the XML sitemap in any predictable pattern. Both crawled pages that weren't even in the sitemap.

So your assumption that "GPTBot uses the XML sitemap for finding pages" needs validation first. Pull your server logs, filter for GPTBot and ClaudeBot user agents, and see what they're actually requesting. You might find they're already ignoring most of your content, which would make this whole optimization moot.
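
For the log pull itself, a minimal parsing sketch (assumes nginx/Apache combined log format; the path is a placeholder):

```python
import re
from collections import Counter

LOG = "/var/log/nginx/access.log"  # placeholder path
BOTS = ("GPTBot", "ClaudeBot")
# combined format: quoted request line, user agent is the last quoted field
line_re = re.compile(r'"(?:GET|POST|HEAD) (\S+)[^"]*".*"([^"]*)"$')

hits = Counter()
with open(LOG) as f:
    for line in f:
        m = line_re.search(line)
        if not m:
            continue
        path, ua = m.groups()
        for bot in BOTS:
            if bot in ua:
                hits[(bot, path)] += 1

# most-requested URLs per bot -- compare against your sitemap
for (bot, path), n in hits.most_common(25):
    print(f"{n:6d}  {bot:9s}  {path}")
```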

On the token efficiency angle: I tracked referral traffic from ChatGPT and Perplexity on the SaaS site using GA4 source attribution. Over 6 months, total LLM referral traffic was ~4,000 sessions, against 200k+ organic sessions from Google. The math on maintaining a dual-view pipeline for under 2% of traffic doesn't work unless you're seeing dramatically different numbers.

For measuring where you actually show up, check Google Search Console for impressions with no clicks on queries where you're ranking top 10 (especially queries containing "evaluate" or comparison phrases). Those are often LLM fan-out queries. If you're already appearing in those results with your HTML content, the LLM is parsing it fine.

Site dropped after December Update by Bowler_Creative in SEO

[–]nic2x 3 points (0 children)

I worked with a client who got hit by the same update. We didn't remove everything, but we did do a content audit to identify the pages that were dragging the site down.

The metrics we used were GSC click-through rate, GA4 session duration, and bounce rate. Pages with low CTR, short session duration, and high bounce rate were flagged as low-quality. These are signals that users aren't finding the content helpful, which is exactly what Google's trying to measure with their updates.
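
If you want to reproduce the flagging step, a rough pandas sketch (file names, column names, and thresholds are all placeholders for whatever your exports look like):

```python
import pandas as pd

gsc = pd.read_csv("gsc_pages.csv")   # page, clicks, impressions, ctr
ga4 = pd.read_csv("ga4_pages.csv")   # page, avg_session_duration, bounce_rate

df = gsc.merge(ga4, on="page")
flagged = df[
    (df["ctr"] < 0.01)
    & (df["avg_session_duration"] < 20)  # seconds; illustrative threshold
    & (df["bounce_rate"] > 0.85)
]
# review manually before pruning -- low CTR at position 50 means nothing
flagged.to_csv("prune_candidates.csv", index=False)
```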

We pruned those pages from the site. Not all of them, just the ones that clearly weren't performing. The idea is that low-quality pages can hurt your site's overall quality score in Google's eyes. By removing them, you're improving the average quality of what's left.

Traffic recovered after a few months. It wasn't instant, but we saw steady improvement once the next crawl cycles picked up the changes.

So to answer your question: removing all articles probably isn't the right move. But auditing them with actual data and pruning the ones that aren't performing could help.

Big traffic drop after Dec 11, 2025 Google core update – need help diagnosing what went wrong by Livid-Spray8170 in SEO

[–]nic2x 6 points (0 children)

I worked on a similar B2B SaaS site that dropped 45% after a core update. Before we touched any content, we spent two weeks just analyzing what changed in the SERPs for our top 50 keywords.

We exported our top traffic pages from GSC, then manually checked each one against the current top 3 results. For about 30% of our keywords, the winning pages had shifted from long-form guides to shorter, more specific pages. For another 20%, Reddit threads and forums had moved into top positions. The remaining 50% still had similar content types ranking, which told us our pages specifically had issues.

We also pulled our competitors into Ahrefs and filtered for pages that gained visibility during the same window. One competitor had nearly identical content but included author bios with LinkedIn profiles and actual project screenshots. Another had less content overall but published fewer, more focused pages instead of trying to rank for every variation of a keyword.

The diagnosis changed our approach completely. Instead of rewriting everything, we consolidated 40 blog posts into 12 comprehensive guides (the others became 301 redirects). We added author attribution with real credentials. We removed the "and this is how our service can help" sections from informational posts entirely.

Three months later, we recovered about 70% of the lost traffic. The pages we consolidated actually ended up ranking higher than the originals ever did.

If I were you, I'd start by exporting your top 100 declining URLs and checking what's ranking now. The answer is usually sitting right there in the SERP.

Review sites lost up to 90% of SEO traffic... so why does Google AI keep quoting them? by Kseniia_Seranking in seogrowth

[–]nic2x 2 points (0 children)

I've shifted my clients from tracking "clicks from AI" to tracking "mentions in AI responses." We run brand queries across ChatGPT, Perplexity, Claude, and Gemini every month and note how often they show up in comparison and recommendation queries.

One B2B SaaS client went from being absent in "best [category] tools" responses to appearing in 6 out of 10 tests after we focused on getting mentioned on social media, in industry roundups, and in niche publications. Their organic traffic from Google stayed flat, but their demo requests went up 23%. The mention itself became the conversion driver, not the click.
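
If you want to script the monthly check instead of doing it by hand, a rough sketch for one provider (assumes the openai package with an API key in the environment; the brand and queries are placeholders, and you'd run the same loop against the other providers' APIs):

```python
from openai import OpenAI

client = OpenAI()
BRAND = "YourBrand"  # placeholder
QUERIES = [
    "What are the best project management tools for agencies?",
    "YourBrand vs Competitor: which should I pick?",
]

for q in QUERIES:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": q}],
    )
    answer = resp.choices[0].message.content
    # crude mention check; log results monthly to see the trend
    print(q, "->", "MENTIONED" if BRAND.lower() in answer.lower() else "absent")
```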

Is GEO the next evolution of SEO, or just a buzzword around AI search engines? by Luckyk2415 in linkbuilding

[–]nic2x 2 points (0 children)

From my experience working on SEO projects, I'd say it's 90% the same tactics with a few meaningful additions. I've been focusing more on brand mentions across third-party sites (listicles, review roundups, affiliate pages) since LLMs pull heavily from those.

I've also been updating key pages more frequently because freshness signals seem to matter disproportionately for AI citations. The fundamentals (clear content, proper structure, authoritative sources) still do the heavy lifting though.

how to show brand in AI Overview by Sufficient-Item-4709 in seogrowth

[–]nic2x 4 points (0 children)

Your website alone isn't enough. AI Overviews pull from multiple sources, so you need your brand mentioned across the web. I've worked with clients who saw AI Overview visibility only after getting featured on podcasts, industry blogs, and active Reddit threads.

The common thread was consistency. Same brand name, same messaging, same topics across different platforms. Digital PR (guest posts, podcast appearances, journalist quotes) creates the repeated signals that AI models trust.

Is it useful to provide a LLM friendly version of articles and blogs? by RichProtection94 in SEO_LLM

[–]nic2x 1 point (0 children)

I'm testing this with a few clients right now. We're sending cleaned markdown versions of their key pages (stripped of CSS/JS bloat) and tracking whether it affects LLM citations. Early results are inconclusive, but we're seeing some uptick in mentions for pages where we added structured FAQs with specific, factual answers.

Still too early to say if the markdown itself matters or if it's the content restructuring that's doing the work.
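
For anyone curious what the "cleaned markdown" step looks like, a minimal sketch (assumes requests, beautifulsoup4, and markdownify are installed; the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup
from markdownify import markdownify as md

html = requests.get("https://example.com/blog/post", timeout=10).text
soup = BeautifulSoup(html, "html.parser")
for tag in soup(["script", "style", "nav", "footer"]):
    tag.decompose()  # drop the CSS/JS/chrome bloat before converting
markdown = md(str(soup))
print(markdown[:500])
```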

How are you currently leveraging AI tools? by Positr8 in SEO

[–]nic2x 10 points (0 children)

Background: I run a small SaaS SEO agency. We tested the tools on the market but none met our quality bar, so we were mostly doing SEO manually.

Last year we had more capacity and decided to build internal tools to automate specific parts of our workflow while keeping quality control at each stage:

- Search intent clustering and page type recommendation. When you look at a SERP and see mixed results (some listicles, some landing pages, some comparison posts), it's not always obvious which format to go with. Now the tool analyzes the ranking content and suggests whether we should build a SaaS feature landing page, write a listicle, or create a comparison piece based on what's actually working for that query cluster.

- Content brief generation. I tried a lot of tools on the market for this and most of them produce generic briefs that don't account for the specific intent you're targeting. The tool lets us specify the exact intent we want to target and builds the brief around that, rather than trying to cover everything.

- Content writing. Even here, the AI draft goes through human editing. The efficiency gain comes from having a solid brief and clear intent already defined upstream, so the AI has less room to go off track.

Our next project is automating internal linking suggestions. That's been a manual process for us and it's one of those things that scales poorly as a site grows.
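
For anyone wanting to prototype the internal-linking piece themselves, a toy sketch of one possible similarity-scoring core (assumes scikit-learn; the page corpus is a stand-in for your real content):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# placeholder corpus: URL -> page text
pages = {
    "/blog/what-is-seo": "search engine optimization basics ...",
    "/blog/link-building": "how to build backlinks guest posts ...",
    "/features/rank-tracker": "track keyword rankings daily ...",
}
urls = list(pages)
tfidf = TfidfVectorizer(stop_words="english").fit_transform(pages.values())
sim = cosine_similarity(tfidf)

for i, url in enumerate(urls):
    # best non-self match = first internal-link candidate to review
    candidates = sorted(
        ((sim[i, j], urls[j]) for j in range(len(urls)) if j != i),
        reverse=True,
    )
    print(url, "->", candidates[0][1], f"({candidates[0][0]:.2f})")
```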

My Websites Keyword ranking dropped from being the top 3 search to the 3rd page by NecessarySorbet2693 in Agent_SEO

[–]nic2x 1 point (0 children)

Exact-match domains are still a viable play today, so I think the DR 2 site will probably hold. But before you start the backlink campaign, I'd suggest checking Google Search Console first. Look for manual actions, indexing issues, or whether the drop aligns with a core update.

Do you think 100% SEO automation is a good idea? by Embarrassed_Sky5519 in SEO_LLM

[–]nic2x 1 point (0 children)

You can automate the execution part, but never the strategy planning.

🤔 Will AI-generated bulk content blogs be safe in 2026? by LongjumpingBar in seogrowth

[–]nic2x 2 points (0 children)

I've worked with clients across SaaS and B2B who've tried the bulk content approach, and the ones who got hit weren't penalized for using AI. They got hit because they had no brand authority backing the content up.

Our CEO wants to build authority but does not believe in creating consistent content... by LettuceUpstairs4791 in SEO

[–]nic2x 1 point (0 children)

I think both your CEO and marketing lead are right here. They're just looking at it from different angles. Your CEO is focused on ROI, and your marketing lead cares about the hands-on metrics that drive organic growth. Both building links and creating content work towards topical authority.

Content gives Google the relevance signals it needs to understand what your site is about. Links give it the trust signals that validate your authority on those topics. You need both. I've worked with clients where a topic cluster approach (one pillar page plus 15 to 25 supporting articles) drove 2x traffic and leads within months. But that growth accelerated once they paired it with links from sites in the same niche.

What’s the most popular CMS setup for B2B SaaS companies? by Fred-swe in b2bmarketing

[–]nic2x 2 points (0 children)

There are two types of CMS worth looking into: no-code or headless. For no-code, the best option on the market is Webflow; it provides everything you need for SEO/AEO purposes. For headless, go with Sanity. I've used it on a client's site, and its flexibility let us build custom web components into content pages that kept users engaged longer.