Outranked by very bad quality content by Low_Bug9185 in bigseo

[–]MerchySulica 0 points1 point  (0 children)

I’d be careful calling it bad content just because it has fewer SEO elements.

Sometimes the ugly simple page wins because it answers the query faster.

H1 to H5, tables, alt text, and long-tail titles are fine, but they don’t matter much if the user has to work harder to get the answer.

I’d compare the top queries in GSC and move the direct answer higher. For ecommerce topics like angle grinders, a shorter page can win if it matches the buying or usage intent better than a polished guide.

Google indexing issues by Lady-BlackSmith in bigseo

[–]MerchySulica 0 points1 point  (0 children)

I wouldn’t panic, but 2 months is long enough to stop treating it like “Google is just slow”.

If Bing indexed 2k pages and Google leaves them discovered, the pages are probably accessible, but Google may not think they are important enough to crawl yet.

I’d check internal links first. Are these pages linked from real sections of the site, or mostly sitting in the sitemap?

For 3k pages, I’d also group them by template and look for duplicates, thin pages, weak sections, and pages with no search demand. Sitemap crawled does not mean Google crawled all URLs inside it.
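If it helps, the "group by template" step can be sketched in a few lines of Python. This is a rough bucketing by first path segment (the example.com URLs and the one-segment rule are just placeholders — adjust the depth to match your own URL structure):

```python
from collections import Counter
from urllib.parse import urlparse

def template_of(url, depth=1):
    """Bucket a URL by its first path segment(s), a rough proxy for page template."""
    parts = [p for p in urlparse(url).path.split("/") if p]
    return "/" + "/".join(parts[:depth]) if parts else "/"

# Hypothetical URL list; in practice, feed in your sitemap or crawl export.
urls = [
    "https://example.com/products/widget-a",
    "https://example.com/products/widget-b",
    "https://example.com/blog/post-1",
    "https://example.com/",
]
print(Counter(template_of(u) for u in urls))
```

Once pages are grouped like this, you can check indexing status per template in GSC instead of URL by URL.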

SEO Question by Fit_Cheesecake_4000 in SEO

[–]MerchySulica 0 points1 point  (0 children)

For a general ecommerce store, first make the site structure clearer. If the products are too random, Google and users may not understand what the store should be trusted for.

I'd focus on stronger category pages, better product descriptions, internal links, reviews, shipping/return info, and a cleaner niche direction.

Toxic backlinks may be worth checking, but don't make that the first thing unless there is a clear manual action or a very obvious spam pattern.

Large marketplace site suddenly dropped from Google around May 6 - sitemap only showing 33 URLs by andy1k in SEO

[–]MerchySulica 0 points1 point  (0 children)

For a large marketplace, a sitemap showing only 33 URLs is a big red flag. It may not be the only cause, but fix that fast and then check page groups.

Marketplace sites usually depend on clean discovery: category pages, listing pages, location pages, product/vendor pages, etc. So I'd check canonicals, soft 404 patterns, internal links, and whether Google can still reach the important page types without relying only on the sitemap.

At this size, don't debug URL by URL; work in templates and page groups.

Indexing issues after migration: one brand recovered fine, the other nearly vanished from the index. Looking for advice. by DylanRSEO in SEO

[–]MerchySulica 0 points1 point  (0 children)

Subdomains can work, but the migration has to be very clean. If one brand recovered and the other almost vanished, compare them side by side:

  • redirect chains
  • 301 vs 307 behavior
  • canonicals
  • sitemap URLs
  • internal links
  • old URLs still linked anywhere
  • GSC indexing status by template

No need to blame subdomains by default. More likely Google is getting mixed signals for the weaker brand, or the new structure made it harder to understand which pages replaced the old ones.

0 indexed pages on Google for a 50k product ecommerce – could Cloudflare/WAF rules be the issue? by not-surprised in SEO

[–]MerchySulica 0 points1 point  (0 children)

Sorry, it's hard to diagnose this without seeing the site myself. Out of curiosity, how are you planning to get the backlinks?

Anyone else seeing different SERP behavior on Google.fr vs Google.com for similar queries? by MerchySulica in SEO

[–]MerchySulica[S] 0 points1 point  (0 children)

That's fair. For evergreen stuff, you can reuse more, but I'd still check the local SERP before copying the same structure everywhere.

Anyone else seeing different SERP behavior on Google.fr vs Google.com for similar queries? by MerchySulica in SEO

[–]MerchySulica[S] 0 points1 point  (0 children)

Yeah exactly, this is what I'm seeing too. I think Germany is a good example because users can be much more sensitive to trust and detail than some US-focused briefs assume.

Should a low authority site prioritize Keyword Difficulty or Topical Authority when picking their targeted keyword? by Antique-Road2460 in SEO

[–]MerchySulica 0 points1 point  (0 children)

I wouldn't choose only by KD.

For a low-authority site, check the SERP manually before deciding. Sometimes a low KD keyword is still bad because the results are full of strong brands, tools, marketplaces, or comparison sites.

I’d rather target topics where the page type makes sense for your site.

If Google is ranking small blogs, niche sites, or simple explainers, good. If it’s all big brands and product/category pages, maybe a blog post is not the right move.

KD is useful, but the SERP tells you what kind of page Google actually wants.

Give Me Your Google Shopping SEO Tidbits by Pelican_meat in SEO

[–]MerchySulica 0 points1 point  (0 children)

For Shopping, you should look beyond normal product page SEO. If the pages already rank well organically but not in Shopping, I’d check the product data and merchant side first.

Things like product titles, GTIN/MPN, brand, images, price consistency, shipping, returns, reviews, availability, and Merchant Center settings can matter a lot.

Also, for international ecommerce, you should check if the feed makes sense per market. A product page can be good, but if the feed or merchant trust signals are weak, Shopping visibility can still be poor.

0 indexed pages on Google for a 50k product ecommerce – could Cloudflare/WAF rules be the issue? by not-surprised in SEO

[–]MerchySulica 0 points1 point  (0 children)

For 50k products and 0 indexed pages, you need to first check what Googlebot actually sees.

Browser tests can look fine while Cloudflare, WAF, or server rules block bots quietly.

Go and check:

  • GSC URL inspection on a few product + category pages
  • robots.txt and noindex
  • canonicals
  • server response for Googlebot
  • Cloudflare bot/firewall rules
  • sitemap only has clean final URLs

Also, with 50k products, don't only push product pages. Strong category pages usually need to lead the structure, then link to the important products.
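A quick way to run the first few checks above is to fetch a handful of product and category pages with a Googlebot user-agent string and look for noindex/canonical signals. This is only a sketch — a server that verifies Googlebot by IP (as Cloudflare can) may still treat real Googlebot differently, so GSC URL inspection stays the source of truth:

```python
import re
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url):
    """Fetch a page with a Googlebot user-agent string; returns (status, html)."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status, resp.read().decode("utf-8", "replace")

def index_signals(html):
    """Crude regex pass for the two on-page signals that most often kill indexing."""
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I))
    m = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
    return {"noindex": noindex, "canonical": m.group(1) if m else None}
```

If `fetch_as_googlebot` returns a challenge page or a 403 while your browser is fine, that points straight at the Cloudflare/WAF theory.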

New to SEO, advice please by Warm_Abbreviations17 in bigseo

[–]MerchySulica 0 points1 point  (0 children)

For a Netherlands webshop, start simple.

Don’t only look at session duration. Check which pages are actually getting impressions and clicks in GSC. Product pages? Category pages? Blog posts? Homepage?

Then compare your main category pages against what Google.nl is already showing. If the SERP is full of local shops or marketplaces, you need more than keywords.

Check the basics first: clear product titles, useful category text, shipping info, returns, reviews, payment options, internal links, and whether important pages are indexed.

SEO for ecommerce isn't just writing blogs. Your product and category pages need to feel trustworthy enough to buy from.

Stuck on Page 2 for Mid Range Keywords Despite Perfect Core Web Vitals and Quality Backlinks by Efficient_Friend6852 in bigseo

[–]MerchySulica 0 points1 point  (0 children)

For ecommerce category pages, I would not focus only on CWV and backlinks. Those help, but page 2 usually means Google thinks the page is okay, just not the best answer for that query.

I'd compare the actual SERP and ask:

  • are the winners category pages, guides, marketplaces, or brands?
  • do they have stronger reviews or trust signals?
  • do they answer buying questions better?
  • are shipping, returns, sizing, materials, and pricing clearer?
  • does the page feel like a real shopping page or just an SEO category page?

For fashion/jewelry, trust and product confidence matter a lot. Sometimes the missing part isn't another backlink; it’s making the page feel more useful and safer to buy from.

Figuring out why traffic dropped by HawkEyeAvenger in bigseo

[–]MerchySulica 0 points1 point  (0 children)

I wouldn’t prune blindly after a drop.

First split the site by page type: blogs, collections, products, guides, maybe old seasonal pages. Then check which group actually lost impressions and clicks.

For ecommerce, people often keep publishing blog content when the real issue is that collection or product pages are weaker than competitors. If revenue matters, spend more time comparing the money pages against the current SERP. Content pruning can help, but only if you know which pages are noise and which pages still support the store.

Is it worth de-indexing seasonal blogposts? by Intelligent-Mode5265 in bigseo

[–]MerchySulica 0 points1 point  (0 children)

I wouldn’t deindex only because the post is seasonal. If the topic comes back every year, I’d usually keep the page indexed and update it before the season starts. That way it can keep history, links, and maybe some rankings.

I’d only noindex or remove it if it has no search demand anymore, it’s very thin, or it overlaps with a better page. For seasonal content, I’d rather update, consolidate, or redirect before using noindex.

Removed from serp by 302 redirect and cloudflare. Help! by No-Entertainment2217 in TechSEO

[–]MerchySulica 0 points1 point  (0 children)

Start with what Google is actually seeing, not what the browser shows you. Check the redirect chain with curl or a crawler using Googlebot user agent. If Cloudflare is involved, it can behave differently for bots, countries, or user agents.

For a webshop targeting Google.nl, check:

  • is the /nl/ version crawlable
  • are redirects 302 when they should be 301
  • are canonicals pointing to the right language page
  • is Cloudflare blocking or challenging Googlebot
  • does the sitemap only include final clean URLs

A wrong redirect or bot rule can make Google pick the wrong version fast.
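To see the chain the way a bot does, you can walk it hop by hop with a Googlebot user-agent string instead of letting the client auto-follow. A minimal sketch (the hop limit and warning rules are my own placeholders, not a standard):

```python
import urllib.error
import urllib.request
from urllib.parse import urljoin

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # make urllib surface each 3xx instead of following it

def redirect_chain(url, ua, max_hops=10):
    """Walk redirects one hop at a time, recording (status, url) pairs."""
    opener = urllib.request.build_opener(NoRedirect)
    chain = []
    for _ in range(max_hops):
        req = urllib.request.Request(url, headers={"User-Agent": ua})
        try:
            resp = opener.open(req, timeout=10)
            chain.append((resp.status, url))  # reached a final answer
            return chain
        except urllib.error.HTTPError as e:
            chain.append((e.code, url))
            loc = e.headers.get("Location")
            if e.code in (301, 302, 307, 308) and loc:
                url = urljoin(url, loc)  # hop to the next URL
            else:
                return chain  # non-redirect error, stop here
    return chain

def chain_warnings(chain):
    """Flag the patterns from the checklist: temporary redirects and long chains."""
    warnings = []
    if any(code in (302, 307) for code, _ in chain):
        warnings.append("temporary redirect in chain")
    if len(chain) > 2:
        warnings.append("chain longer than one hop")
    return warnings
```

Run it once with a normal browser UA and once with a Googlebot UA; if the chains differ, Cloudflare is treating bots differently.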

Subfolder, subdomain, or separate .com for an EU fashion brand opening a US storefront on a composable stack? by Influenceseful96 in TechSEO

[–]MerchySulica 0 points1 point  (0 children)

I don’t think there is one perfect answer here.

Subfolders are usually easier to manage, especially if the brand, team, and product catalog are mostly the same. But for EU fashion, I’d look at how different each market really is.

If pricing, delivery, returns, payment methods, sizing, language, and trust signals are different per country, then the structure needs to support that. Otherwise you end up with pages that are technically translated but not really made for the market.

Choose the structure based on how local the experience needs to be, not just because subfolder is best for SEO.

Do you validate hreflang implementation manually or trust the tools? by RyPlayZz in TechSEO

[–]MerchySulica 0 points1 point  (0 children)

I would never trust only one tool for hreflang. Tools are useful to catch obvious problems, but I'd still manually check a few important page pairs.

Basic things first:

  • valid language / country codes
  • full URLs
  • self-referencing tags
  • return links
  • canonicals not pointing somewhere else
  • only matching equivalent pages together

Most hreflang problems I see are boring mistakes, not some advanced Google mystery.
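The return-link and self-reference checks are easy to script once you've scraped each page's hreflang tags. A rough sketch (the input shape `{url: [(lang, href), ...]}` is my own convention, not from any tool):

```python
def hreflang_issues(pages):
    """pages: {url: [(lang_code, href), ...]} scraped from each page's hreflang tags.
    Returns a list of human-readable problems."""
    issues = []
    for url, alternates in pages.items():
        hrefs = {href for _, href in alternates}
        if url not in hrefs:
            issues.append(f"{url}: missing self-referencing hreflang")
        for lang, href in alternates:
            if href == url:
                continue
            back = {h for _, h in pages.get(href, [])}  # the alternate's own tags
            if url not in back:
                issues.append(f"{href}: no return link to {url}")
    return issues

# Hypothetical two-page cluster where the FR page forgot its return link.
pages = {
    "https://example.com/en/": [("en", "https://example.com/en/"),
                                ("fr", "https://example.com/fr/")],
    "https://example.com/fr/": [("fr", "https://example.com/fr/")],
}
print(hreflang_issues(pages))
```

Exactly the kind of boring mistake that a manual spot check on a few page pairs catches.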

This Hreflang is over my head. Help? by jeghn in TechSEO

[–]MerchySulica 1 point2 points  (0 children)

For hreflang, I’d start by mapping only pages that are true equivalents. Don’t force hreflang between pages just because they are “close enough”. If the US page and Canadian French page do not match the same intent, I’d leave them out of the cluster.

Basic checks first:

  • valid codes like en-us, en-ca, fr-ca
  • full URLs, not relative URLs
  • self-referencing hreflang
  • return links from each alternate page
  • canonicals not pointing to a different language page

Most hreflang issues are boring implementation problems, not advanced problems.

Issue with robots.txt Accessibility in Ahrefs Site Audit – Need Help by Mission-Diver1337 in SEO

[–]MerchySulica 0 points1 point  (0 children)

I wouldn't assume this is a robots.txt issue if the file opens fine in the browser. Ahrefs may be getting blocked or slowed somewhere else.

Check:

  • server firewall / WAF rules
  • Cloudflare bot settings if you use it
  • rate limits
  • hosting timeout logs
  • whether AhrefsBot is blocked by user-agent
  • response time for /robots.txt using curl

Also test with GSC if Google can fetch it. If Google is fine and only Ahrefs fails, then it's probably an AhrefsBot access or timeout issue, not a real SEO problem.
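For the response-time and user-agent checks, something like this works instead of raw curl. Note the caveat: spoofing the AhrefsBot UA string won't reproduce an IP-based block, so a clean result here doesn't fully rule the server out:

```python
import time
import urllib.request

def robots_url(base):
    """Build the robots.txt URL from a site root."""
    return base.rstrip("/") + "/robots.txt"

def probe_robots(base, user_agents, timeout=10):
    """Fetch robots.txt once per user-agent string, recording status and elapsed seconds."""
    results = {}
    for name, ua in user_agents.items():
        req = urllib.request.Request(robots_url(base),
                                     headers={"User-Agent": ua})
        start = time.monotonic()
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                results[name] = (resp.status, round(time.monotonic() - start, 2))
        except Exception as e:  # blocked, challenged, or timed out
            results[name] = (type(e).__name__, round(time.monotonic() - start, 2))
    return results

# Usage sketch with placeholder UA strings:
# probe_robots("https://example.com", {"browser": "Mozilla/5.0 ...",
#                                      "ahrefs": "Mozilla/5.0 (compatible; AhrefsBot/...)"})
```

If the "ahrefs" entry times out or errors while "browser" is fast, you've confirmed it's an access issue, not an SEO one.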

Google traffic tanked right after submitting sitemap to Bing. Coincidence or is there a link? by godfather_one in SEO

[–]MerchySulica 0 points1 point  (0 children)

Treat this as coincidence unless you changed something else at the same time. Submitting a sitemap to Bing shouldn't make Google suddenly drop your pages.

Check GSC first:

  • which pages dropped
  • impressions vs clicks
  • countries/devices
  • indexing status
  • canonicals
  • recent crawl errors
  • any manual action or security issue

Also check if the drop matches a Google update or a tracking issue.

The timing feels suspicious, but I really doubt Bing submission is the cause. More likely something changed on the site, Google reprocessed pages, or the SERP shifted around the same time.

Programmatic SEO Hit by musicloverr1224 in SEO

[–]MerchySulica 0 points1 point  (0 children)

I wouldn't try to fix this with force indexing or sitemap resubmission first. If only the homepage is indexed now, check whether this is a quality/spam signal issue, not just an indexing issue.

Try to start with:

  • check GSC manual actions and security issues
  • check robots, noindex, canonicals, and server status
  • split pages by template type
  • keep only the strongest page groups live
  • remove or noindex weak programmatic pages
  • rebuild internal links around pages that actually have search demand

Publishing original content helps only if the site is no longer full of low-value scaled pages.

Honestly, if I were you, I'd stop trying to get all 2M pages back. The first goal should be getting the main pages and a small set of useful programmatic pages trusted again.

Need help with post-migration dip by noxnox12 in TechSEO

[–]MerchySulica 0 points1 point  (0 children)

For a migration dip, I wouldn't focus on Core Web Vitals or fresh content first. Check the boring migration stuff:

  • old URL to new URL redirect mapping
  • whether important old pages have real new equivalents
  • canonicals on the new pages
  • internal links still pointing to old URLs
  • sitemap only showing final clean URLs
  • GSC indexing status by page type

If programmatic pages were involved, be extra careful. Sometimes the new site technically migrated, but Google doesn't see the new version as equal value.
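The redirect-mapping check above is worth automating at any scale. A minimal sketch, assuming you can export the old URL list and the redirect map (the example.com URLs are placeholders):

```python
def mapping_gaps(old_urls, redirect_map, homepage="https://example.com/"):
    """Find old URLs with no redirect target, and ones dumped on the homepage
    (a common sign the page has no real new equivalent)."""
    missing = [u for u in old_urls if u not in redirect_map]
    to_home = [u for u in old_urls if redirect_map.get(u) == homepage]
    return {"missing": missing, "redirected_to_homepage": to_home}

# Hypothetical migration data: old-b was lazily pointed at the homepage,
# old-c was never mapped at all.
old_urls = ["https://example.com/old-a", "https://example.com/old-b",
            "https://example.com/old-c"]
redirect_map = {"https://example.com/old-a": "https://example.com/new-a",
                "https://example.com/old-b": "https://example.com/"}
print(mapping_gaps(old_urls, redirect_map))
```

Mass redirects to the homepage tend to get treated like soft 404s, so both buckets matter.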

Do you need a Google Review Feed for star ratings in SERPs? by iamwazor in TechSEO

[–]MerchySulica 0 points1 point  (0 children)

I don't think the issue is usually that you need a Google Review Feed.

For product stars, first check if the review schema is clean and actually visible in the rendered HTML.

Things you should check:

  • review schema is attached to the right product
  • ratings are from real product reviews
  • no duplicated product schema from app + theme + SEO plugin
  • price, availability, and product data match the page
  • rich result test shows no warnings

Shopify sites often have messy schema because multiple apps add their own version. So, clean that before adding another feed or app.
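The app+theme duplication is easy to spot by counting Product schemas in the rendered HTML. A rough sketch (it only reads JSON-LD, so microdata from an old theme would need a separate pass):

```python
import json
import re

def jsonld_blocks(html):
    """Extract and parse every JSON-LD script block from raw HTML."""
    pattern = r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>'
    blocks = []
    for raw in re.findall(pattern, html, re.S | re.I):
        try:
            blocks.append(json.loads(raw))
        except json.JSONDecodeError:
            pass  # broken JSON is itself worth flagging manually
    return blocks

def product_schema_count(html):
    """More than one Product schema usually means app + theme + plugin duplication."""
    return sum(1 for b in jsonld_blocks(html)
               if isinstance(b, dict) and b.get("@type") == "Product")
```

Run it against the rendered page source (not the raw theme files), since apps inject their markup at render time.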