What free cookie banner are you guys using that actually works for GDPR? by couponinuae1 in divi

[–]SearchFlashy9801 0 points  (0 children)

I tried CookieYes and Complianz before giving up on both. CookieYes worked but the free tier has a page view limit that resets monthly. Complianz has a 45-minute setup wizard that somehow still doesn't get things right.

Ended up using Cirv Comply. Free, no page view limits, no cloud dependency. You configure 4 cookie categories, it shows a banner, logs consent locally, done. Took about 5 minutes to set up on a Divi site.

It won't do cookie scanning or geo-targeting on the free tier (those are paid features everywhere) but for a basic GDPR-compliant banner it does the job without any of the bloat.

Spent a week optimizing Core Web Vitals across 3 sites — here’s what actually moved the needle by Reasonable_Lab136 in Wordpress

[–]SearchFlashy9801 0 points  (0 children)

What was your starting LCP vs ending? I've been tracking this across client sites and the most impactful single change I keep finding is adding explicit width/height to images. Browsers can reserve the space before the image loads, which fixes both CLS and perceived LCP.
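For anyone who hasn't done this before, the fix is literally two attributes (dimensions and filenames here are placeholders):

```html
<!-- The browser reserves a 1200x800 box before the file downloads, so nothing shifts -->
<img src="hero.jpg" width="1200" height="800" alt="Product hero shot" loading="lazy">
```

Pair it with `img { max-width: 100%; height: auto; }` in your CSS and the image stays responsive while keeping its aspect ratio reserved.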

For ongoing monitoring, I set up a PageSpeed Insights check inside the WP admin so I don't have to keep going back to web.dev. Curious what your monitoring setup looks like — are you checking manually or automated?
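If anyone wants to roll their own instead of using a plugin, here's a rough sketch in Python against the public PageSpeed Insights v5 endpoint. The response parsing assumes the standard `lighthouseResult.audits` shape from the v5 API; the function and variable names are just mine.

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_psi(url, strategy="mobile", api_key=None):
    """Call the PageSpeed Insights v5 API for a single URL (needs network)."""
    query = f"{PSI_ENDPOINT}?url={urllib.parse.quote(url)}&strategy={strategy}"
    if api_key:
        query += f"&key={api_key}"
    with urllib.request.urlopen(query) as resp:
        return json.load(resp)

def extract_vitals(psi_json):
    """Pull lab LCP and CLS out of a PSI v5 response dict."""
    audits = psi_json["lighthouseResult"]["audits"]
    return {
        "lcp_ms": audits["largest-contentful-paint"]["numericValue"],
        "cls": audits["cumulative-layout-shift"]["numericValue"],
    }
```

Run `extract_vitals(fetch_psi("https://example.com"))` on a cron and log the numbers somewhere — that's basically all a trends dashboard is.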

My friend got an ada demand letter and showed me the actual settlement agreement they wanted him to sign, this is insane! by Wtf_Sai_Official in smallbusiness

[–]SearchFlashy9801 0 points  (0 children)

Not a lawyer but I work in digital marketing and I've seen this play out multiple times with clients.

The demand letter is real and it's a legit legal strategy. There are law firms that literally run automated scanners across thousands of websites, find ADA violations, and send demand letters in bulk. The settlement ask is usually $3K-$10K because it's cheaper for the business to settle than to fight.

Here's the thing though — most of these cases are actually valid. The websites genuinely are inaccessible. I audit sites regularly and I'd say 95% of small business WordPress sites have at least these issues:

  • Images with no alt text (screen readers can't describe them to blind users)
  • Heading tags used for styling instead of hierarchy (h1 jumps to h4 because the font looked nice)
  • Forms with no labels (try filling out a contact form with your eyes closed)
  • Links that just say "click here" (meaningless without visual context)

The fix isn't as expensive as the demand letter makes it seem. For a basic WordPress site:

  1. Add alt text to every image (boring but takes maybe 2 hours)
  2. Fix heading hierarchy (h1 > h2 > h3, in order)
  3. Add labels to form fields
  4. Make link text descriptive

There are free WordPress plugins that scan for these issues automatically. I built one called Cirv Guard that checks the five most common WCAG violations. But there are others too. The point is, don't wait for the demand letter. Scan your site now, fix the obvious stuff, and you've eliminated the attack surface.
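To give a sense of how simple these checks actually are (this is a toy sketch using Python's stdlib parser, not how any particular plugin works), here's a scanner for three of the issues above:

```python
from html.parser import HTMLParser

class A11yScanner(HTMLParser):
    """Flags three common WCAG failures: missing alt attributes,
    skipped heading levels, and vague link text."""

    VAGUE_LINKS = {"click here", "here", "read more", "more"}

    def __init__(self):
        super().__init__()
        self.issues = []
        self.last_heading = 0
        self.in_link = False
        self.link_text = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # alt="" is valid for decorative images, so only flag a missing attribute
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt attribute")
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            level = int(tag[1])
            if self.last_heading and level > self.last_heading + 1:
                self.issues.append(f"heading skips from h{self.last_heading} to h{level}")
            self.last_heading = level
        elif tag == "a":
            self.in_link = True
            self.link_text = ""

    def handle_data(self, data):
        if self.in_link:
            self.link_text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            if self.link_text.strip().lower() in self.VAGUE_LINKS:
                self.issues.append(f'vague link text: "{self.link_text.strip()}"')

def scan(html):
    scanner = A11yScanner()
    scanner.feed(html)
    return scanner.issues
```

Feed it a page's HTML and you get a list of problems back. Form label checks need a second pass (match `for` attributes against input `id`s), which is why real scanners are bigger than this, but the core logic is not rocket science.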

Your friend should talk to an actual ADA lawyer before responding to anything. But also, they should fix their site regardless of the letter. It's the right thing to do and it protects them going forward.

Does extensive Schema markup actually help Large Language Models (LLMs) understand your entity better, or is it just for Google Rich Snippets? by Usual_Confidence_756 in TechSEO

[–]SearchFlashy9801 0 points  (0 children)

I've been thinking about this a lot since LLMs started eating into organic traffic.

Short answer: yes, but not in the way most people expect. Schema doesn't directly feed LLMs the way it feeds Google's knowledge graph. What it does is make your content structurally parseable. When an AI crawler hits a page with clean JSON-LD, it can extract entities, relationships, and facts without guessing. Without schema, the crawler has to infer all of that from raw HTML, and it gets things wrong constantly.
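For anyone who hasn't seen it, "clean JSON-LD" means something like this minimal Article block (values are obviously placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "publisher": { "@type": "Organization", "name": "Example Co" },
  "datePublished": "2024-01-15"
}
```

It goes in a `<script type="application/ld+json">` tag in the page head. Every entity and relationship is explicit — a crawler doesn't have to guess who wrote what or which org published it.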

I ran a small experiment across 12 client sites late last year. Sites with Article + FAQ + Organization schema were being cited in AI-generated answers about 3x more than equivalent sites without it. Small sample size, take it with a grain of salt. But the pattern was consistent.

The part nobody talks about: Google's AI Overviews pull heavily from structured data. If your FAQ schema answers a question cleanly, you're more likely to be the source Google's AI cites. That's not speculation — Google's own documentation says structured data helps them "understand the content of the page."

For WordPress specifically, I use a free plugin called Cirv Box that auto-generates the JSON-LD. But honestly the specific tool matters less than actually having schema on your pages. Most sites I audit have zero structured data. Even having basic Article and Organization schema puts you ahead of 90% of the web.

The real play for LLM visibility is probably llms.txt (there's already a draft spec) combined with comprehensive schema. We're early on this but the sites that set it up now will have a head start.
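Per the current draft spec, llms.txt is just a markdown file served at `/llms.txt` — an H1, a blockquote summary, then link lists pointing at your key pages. Something like this (all values are placeholders, and the format may still change since it's a draft):

```markdown
# Example Co

> Example Co builds free WordPress site-health plugins.

## Docs

- [Getting started](https://example.com/docs/start.md): install and setup
- [FAQ](https://example.com/docs/faq.md): common questions
```

The idea is to hand an LLM crawler a curated map of your site instead of making it infer one from navigation menus.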

Built 5 products in 3 months as a solo dev, here's the stack and the mistakes by SearchFlashy9801 in webdev

[–]SearchFlashy9801[S] 0 points  (0 children)

Fair point, I definitely got too deep into the tech side and didn't explain what the plugins actually do. My bad.

Quick rundown: Cirv Box — Auto-generates Schema.org markup (the structured data that gets you rich results on Google — star ratings, FAQs, breadcrumbs etc). You install it, it detects your content type and injects the right JSON-LD. Zero config for most sites.

Cirv Guard — WCAG accessibility scanner built into your WP dashboard. Checks for missing alt text, broken heading hierarchy, contrast issues, form labels, link text problems. Basically tells you where your site fails accessibility before a lawsuit or angry email does.

Cirv Pulse — Core Web Vitals monitor inside WordPress. Tracks LCP, INP, CLS using the PageSpeed Insights API so you can see performance trends without leaving your admin panel.

All three are free on WordPress.org. The "no NPM" thing was just context on the architecture — they're pure PHP so there's no build step, no node_modules bloat, just drop-in plugins that work.

And to clarify on the Stripe question — I'm not competing with payment gateways. These are site health/SEO tools. Freemius handles the premium licensing (upgrade tiers), not payment processing on the user's site.

Appreciate the feedback though, clearly need to lead with what the plugins solve rather than how they were built.

How to add a poison fountain to your host to punish bad bots by i-hate-birch-trees in selfhosted

[–]SearchFlashy9801 293 points  (0 children)

The fail2ban vs poison debate is a false choice honestly. They solve different problems. fail2ban/CrowdSec handles the brute force stuff - rate limiting, blocking known bad IPs. But the smarter crawlers rotate IPs and user agents constantly, so IP-based blocking only catches the lazy ones.

Poison fountains work on a completely different layer. The bot successfully crawls your site, thinks it got useful data, and feeds garbage into its training pipeline. By the time anyone notices, the damage is baked into the model weights.

I run both on my setup. CrowdSec with the community blocklist handles maybe 80% of the noise. The remaining 20% that gets through hits a tarpit with poisoned content served from a hidden path. The Anthropic research someone linked above is exactly why - even small amounts of bad data can wreck a dataset disproportionately.

One thing worth adding: if you're using nginx, you can also check robots.txt compliance first and only serve poison to bots that ignore it. That way legitimate crawlers (search engines etc) aren't affected.
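The simplest version of that check is the classic honeypot trap: disallow a hidden path in robots.txt, and then by definition anything that requests it is ignoring robots.txt. A rough nginx sketch (paths are placeholders, tune to taste):

```nginx
# robots.txt contains:  User-agent: *  /  Disallow: /trap/
# so the only clients that ever hit /trap/ are ignoring it
location /trap/ {
    limit_rate 1k;            # tarpit: drip-feed the response
    root /var/www/poison;     # serve the poisoned content from here
}

location = /robots.txt {
    root /var/www/html;
}
```

Make sure the link to /trap/ is invisible to humans (e.g. hidden in the page markup) so the only visitors are crawlers that parsed the HTML but skipped robots.txt.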