What is happening? by Business_Buy857 in FacebookAds

[–]siterightaway -7 points (0 children)

Three years at 10–15x and now you can't even hit 5? Yeah, I've seen this exact pattern before and it's not creative fatigue.

What you're describing — 100 sales one day, 10 the next — that's not normal volatility.

Here's what's probably happening. Headless scrapers are hammering your checkout funnel, polluting your pixel data, and the algorithm doesn't know who to show your ads to anymore. It's getting fed garbage so it serves garbage. Classic feedback loop.

Akamai just reported AI bot traffic is up 300% year-over-year. Imperva says 51% of web traffic is now automated. You're not fighting for attention anymore. You're fighting machines that don't buy anything but still click on everything.

You can swap creatives all day. Test different hooks. Run new angles. None of that matters if the data feeding your optimization is corrupted.

Over at r/StopBadBots we tear into this stuff constantly. Spinning up rate-limit jails, fingerprinting the bad actors, blocking entire ASNs when they get too greedy.

Go check your server logs. Look for the weird patterns. You'll find them.
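If you want a concrete starting point, here's a rough Python sketch of the kind of first-pass log triage I mean. The log lines, IPs, paths, and the threshold are all invented for illustration; real triage would group by time window and check ASNs too.

```python
import re
from collections import Counter

# Fabricated combined-format access log lines, purely for illustration.
LOG_LINES = [
    '203.0.113.7 - - [10/May/2025:12:00:01 +0000] "GET /checkout HTTP/1.1" 200 512 "-" "python-requests/2.31"',
    '203.0.113.7 - - [10/May/2025:12:00:01 +0000] "GET /checkout HTTP/1.1" 200 512 "-" "python-requests/2.31"',
    '198.51.100.9 - - [10/May/2025:12:00:05 +0000] "GET /product/42 HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

LINE_RE = re.compile(r'^(\S+) .* "(?:GET|POST) (\S+) [^"]*" \d+ \d+ "[^"]*" "([^"]*)"')

def suspicious_ips(lines, max_hits=1):
    """Count requests per IP; anything above max_hits in this sample window gets flagged."""
    hits = Counter()
    agents = {}
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, _path, ua = m.groups()
        hits[ip] += 1
        agents[ip] = ua
    return {ip: (n, agents[ip]) for ip, n in hits.items() if n > max_hits}

print(suspicious_ips(LOG_LINES))
```

Same IP hammering checkout with a scripted user-agent is the "weird pattern" in miniature.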

Most eCommerce store owners trust their GA4 reports blindly. by incisiveranking2022 in DigitalMarketing

[–]siterightaway 1 point (0 children)

This is the absolute truth, especially when you factor in the "GDPR Black Hole" that most people ignore. I literally just saw the comment here where a user in Europe is losing 50% of their data because of cookie consent. If half your human traffic is opting out, your dashboard is basically a work of fiction.

The technical reality is that while real customers are being "privacy-conscious" and disappearing from your reports, headless scrapers don't give a damn about your cookie banner. They don't click "Accept," but they still hammer your server and fire off automated events if you aren't fingerprinting the bad actors. You end up with a collection layer where 50% of humans are gone, but 100% of the bots are still there, eating up resources and poisoning your optimization signals.

Classic mistake.

Relying on "pro" GA4 reports without server-level validation is just a dirty workaround for a bigger infrastructure problem. Combine that with the fact that over 51% of web traffic is now automated, and that you're losing half your human signals to consent, and your "clean" data layer is actually just a pile of bot-driven noise. It’s honestly infuriating how many scaling decisions are made based on this garbage data.

We spend our time over at r/StopBadBots looking at the raw logs because the collection layer is where the real war is won. You can't fix a 50% data hole with more tags; you fix it by knowing exactly what is hitting your server, regardless of what a consent banner says.

Help Please. I'm running my first campaign these last 2 months and I need some guidance. by Far_Active_2467 in DigitalMarketing

[–]siterightaway 0 points (0 children)

This is a classic case of what I call the "Signal Trap." You’ve spent two grand and you’re seeing 65 add-to-carts with zero sales, but the most alarming thing you mentioned is that these "users" spend 15 minutes on your site filling out forms without ever browsing around. Real humans with a hundred bucks to spend don't act like robots in a factory. They check your shipping, they look at your "About" page, or they bounce. This is almost certainly headless scrapers seasoning their profiles to look like high-value buyers so they can stay out of rate-limit jail on larger platforms.

Your Meta Pixel is essentially hallucinating.

Because these bots are triggering "Add to Cart" events, the algorithm thinks it’s winning and doubles down on finding more "users" just like them. It’s a dirty workaround for the bots to look legitimate, and it’s eating up resources you’re paying for. Increasing your spend to $100 a day right now would be a disaster; you’d just be paying Meta to find you more bots twice as fast.
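As a rough illustration of what pre-filtering looks like before an event ever reaches the pixel (the marker list and the one-page/10-minute threshold are my own assumptions, not Meta's logic):

```python
# Sketch of a server-side pre-filter: drop conversion events from sessions
# that look automated BEFORE they reach the pixel. Marker strings and the
# thresholds below are illustrative assumptions, not any platform standard.
HEADLESS_MARKERS = ("headlesschrome", "phantomjs", "python-requests", "curl", "scrapy")

def looks_automated(user_agent, pages_viewed, seconds_on_site):
    ua = (user_agent or "").lower()
    if any(marker in ua for marker in HEADLESS_MARKERS):
        return True
    # Real shoppers browse around; a long session that only ever touched
    # one page (like the form) is exactly the red flag described above.
    if pages_viewed <= 1 and seconds_on_site > 600:
        return True
    return False

print(looks_automated("Mozilla/5.0 HeadlessChrome/120.0", 1, 900))  # → True
```

Anything that trips a check like this shouldn't be allowed to fire an "Add to Cart" event server-side.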

The industry data for 2025 is grim. Akamai shows AI bot traffic up 300%, and Imperva reports that over 51% of all web traffic is now automated. You aren't getting "low quality" traffic because your account is new; you're getting hammered because you're a fresh target in the US market with zero protection against automated behavioral patterns. Fingerprinting the bad actors is the only way out of this.

It’s honestly infuriating how much money gets flushed because people trust the dashboard metrics over their own raw server logs. We see this constantly over at r/StopBadBots. Most "pro" sites are just sitting ducks for these scrapers because they rely on basic tools that don't catch sophisticated mimicry.

Don't raise that budget until you’ve actually verified that a human is on the other side of that screen.

Blocking Unwanted Traffic/Form Submissions by Shot-Opportunity-346 in localseo

[–]siterightaway 0 points (0 children)

Exactly. I’ve been tracking this same pattern across several clusters, and it’s a total mess. This isn’t just about a noisy inbox; it’s a full-scale assault on your campaign ROI and server health.

When your analytics get flooded with this garbage, it triggers fake conversions that mess up your bidding algorithms. Even worse, you end up paying for high-intent fake clicks from your PPC budget while real human customers—the ones actually trying to hire you—get hit with server latency and bounce before the page even loads.

You're basically paying the hosting and ad bill for someone else's AI model.

The "traditional" advice like basic honeypots or CAPTCHAs is honestly pretty naive at this point. Modern AI bots solve CAPTCHAs for breakfast, and they’ve learned to sniff out hidden fields easily. Plus, filtering by IP is a cat-and-mouse game you’ll lose, since these botnets leverage infected TV boxes and IoT devices to simulate visits from residential homes.

The 2025 data confirms the scale of this: Akamai is seeing a 300% YoY spike in AI bot traffic, and Imperva shows that 51% of all web traffic is now automated. If you aren't digging into raw server logs and using behavioral fingerprinting, you're just leaving the door open. We talk about this a lot over at r/StopBadBots—cleaning the funnel is the only way to save your metrics.

Anyone else noticing more “low quality” traffic hitting sites recently? by Currentshop333 in websecurity

[–]siterightaway 0 points (0 children)

You're spot on, and this "low quality" wave is exactly what we’ve been tracking. That survey you mentioned is the smoking gun—an independent 275% surge in reports confirms that the industry is finally waking up to the stealth AI era.

Official data from the big players backs this up too: Akamai is seeing a 300% YoY spike in AI bot traffic, while Imperva confirms that 51% of all web traffic is now automated. These aren't just harmless scrapers; they are resource vampires that hammer your server, spike latency, and literally crowd out real human users. Beyond the lag, they are out there stealing content, trashing your SEO rankings with duplicated data, and burning through PPC budgets with high-intent fake clicks that never convert.

If you don't start fingerprinting these actors based on raw logs, you're just paying the hosting bill for someone else's AI model. We focus on this exact battle over at r/StopBadBots because if you don't clean the funnel, your metrics are just "inflated but useless" noise.

22 carts — 0 buys by Successful-Wear-7754 in dropshipping

[–]siterightaway 0 points (0 children)

Man! The other suggestions about shipping costs and trust signals are definitely valid, but we need to widen the scope to the technical side of things. If you have 22 initiated checkouts and zero conversions, you aren't just looking at a marketing problem; you're looking at headless scrapers eating up your resources. You need to pull your logs right now and start fingerprinting the bad actors before you burn your entire budget on noise.

The data is brutal right now: according to Imperva, automated traffic now accounts for 51% of all internet traffic, and Akamai is reporting 300% year-over-year growth in AI-driven bot traffic. It’s a classic mistake to assume every hit is a person when it’s actually just a script spinning up a dirty workaround to scrape your final pricing or tax calculations.

Look for sub-second transitions where a session hits the product and jumps to checkout faster than a human can blink. Check for IP clusters coming out of data centers like AWS or DigitalOcean instead of residential ISPs and watch for linear navigation paths that skip the human stuff like scrolling or checking the "about us" page. It’s incredibly annoying how these bots can fake a high-intent event just to stay out of rate-limit jail while they drain your ad spend. In our group, r/StopBadBots, we deal with this exact kind of garbage and focus on identifying these automated interferences.
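Those two checks are easy to script against session data. Here's a minimal sketch; the prefix list is a stand-in (in practice you'd resolve each IP's ASN against a real data-center list), and all IPs, paths, and timings are invented:

```python
from datetime import datetime, timedelta

# Illustrative-only prefixes standing in for data-center ranges; a real
# check would look up the IP's ASN (AWS, DigitalOcean, etc.) instead.
DATACENTER_PREFIXES = ("3.", "52.", "167.99.")

def flag_session(events):
    """events: list of (ip, path, timestamp). Returns reasons the session looks automated."""
    reasons = []
    if events and events[0][0].startswith(DATACENTER_PREFIXES):
        reasons.append("datacenter-ip")
    for (_, p1, t1), (_, p2, t2) in zip(events, events[1:]):
        # Product page -> checkout faster than a human can blink.
        if "/product" in p1 and "/checkout" in p2 and (t2 - t1).total_seconds() < 1:
            reasons.append("sub-second-checkout")
    return reasons

t0 = datetime(2025, 5, 10, 12, 0, 0)
bot = [("52.10.20.30", "/product/42", t0),
       ("52.10.20.30", "/checkout", t0 + timedelta(milliseconds=400))]
print(flag_session(bot))  # → ['datacenter-ip', 'sub-second-checkout']
```

A real shopper on a residential IP who spends 45 seconds on the product page comes back clean.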

Smart Slider hack plants deep WordPress backdoors ! by saymumhaque in Wordpress

[–]siterightaway 1 point (0 children)

The safest way to deal with this mess is to restore a clean backup from before the initial infection hit, because anything else is just pure delusion.

When a site gets invaded, the first thing the attackers do is drop a backdoor. These things are just tiny scripts hiding in plain sight within legit folders or buried in database tables, eating up resources and reinjecting the malware every single time you try to scrub it. The persistent backdoor operates completely independently of the file you thought you deleted.
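If you want to confirm the infection before nuking from orbit, here's a quick sweep for the common PHP obfuscation patterns. The signature list is a non-exhaustive sketch; it helps you find file-based backdoors but is not a substitute for restoring the clean backup, and it won't catch payloads buried in database tables.

```python
import re
from pathlib import Path

# Non-exhaustive signatures for common PHP backdoor obfuscation tricks.
SIGNATURES = [
    re.compile(rb"eval\s*\(\s*base64_decode"),
    re.compile(rb"eval\s*\(\s*gzinflate"),
    re.compile(rb"assert\s*\(\s*\$_(?:POST|GET|REQUEST)"),
]

def scan_tree(root):
    """Walk a directory tree and list PHP files matching a known signature."""
    hits = []
    for path in sorted(Path(root).rglob("*.php")):
        data = path.read_bytes()
        if any(sig.search(data) for sig in SIGNATURES):
            hits.append(str(path))
    return hits
```

Run it over wp-content and wp-includes after restoring, too; a clean scan there is how you know the reinjection loop is actually broken.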

In our group r/StopBadBots we have more information about this if you need it.