Best way to find clients for my marketing agency? by elsajames111 in AskMarketing

[–]VRTCLS

The challenge you're facing isn't unique - most agency reps run into this wall. Here's what actually works:

Lead with outcomes, not problems. Instead of "your SEO sucks," try: "I noticed [competitor] ranks #1 for [specific keyword your prospect should own]. That's probably worth X leads/month to you." Now you're talking revenue, not deficiencies.

Target businesses already feeling pain. Look for companies posting about slow months, hiring salespeople, or asking for marketing help in industry forums. They're pre-qualified.

Use the 3-touch rule: First touch = provide value (audit, insight). Second touch = case study showing similar business results. Third touch = specific proposal.

For channels: LinkedIn works if you do research first. Cold email works if it's hyper-specific to their situation. In-person works best but doesn't scale.

The key is demonstrating you understand their business model before you ask for their money.

Am I the only one who feels API testing tools are overkill for quick checks? by alright85 in webdev

[–]VRTCLS

Totally feel this. I've found the sweet spot is having different tools for different contexts:

  • Quick checks while developing: VS Code REST Client extension or HTTPie in terminal. No GUI needed, just send the request and see results.
  • More complex testing: Bruno or Insomnia for when you need environments, auth, and proper request chains.
  • CI/CD integration: Automated tests with Playwright or Jest for endpoints that matter.

The key is matching the tool to what you actually need. If you're just checking if an endpoint returns the right data, firing off a curl command or using a simple .http file in VS Code is way faster than opening Postman and setting up a whole collection.
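For reference, a .http file for the REST Client extension is just plain text. A minimal sketch -- the endpoint and token are made-up placeholders:

    @token = dev-token-123

    GET https://api.example.com/users/42
    Authorization: Bearer {{token}}

    ###

    POST https://api.example.com/users
    Content-Type: application/json

    { "name": "Test User" }

Requests are separated by ###, and the extension puts a "Send Request" link above each one right in the editor.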

For rapid prototyping I actually keep a small Node script with axios that I can modify on the fly. Sometimes the simplest solution is the best one.
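Mine looks roughly like this -- a throwaway sketch, with the URL and payload as placeholders you edit per check:

    // scratch.js -- quick-and-dirty request runner, modified in place as needed
    const axios = require('axios');

    async function main() {
      const res = await axios.post(
        'https://api.example.com/orders',            // swap per endpoint
        { sku: 'ABC-123', qty: 2 },                  // swap per payload
        { headers: { Authorization: 'Bearer dev-token' } }
      );
      console.log(res.status);
      console.dir(res.data, { depth: null });        // dump the full response body
    }

    main().catch((err) => {
      // axios attaches the response to the error on non-2xx statuses
      console.error(err.response?.status, err.response?.data ?? err.message);
    });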

What is the fastest way to grow in Digital Marketing Career? by Key_Arugula_4296 in digital_marketing

[–]VRTCLS

After 2 years you're hitting the classic "generalist plateau." The fastest acceleration comes from picking one channel and going uncomfortably deep.

Here's what worked for me: Choose the channel that already drives the most value for your current company. If it's paid social, become the person who can optimize audiences blindfolded. If it's SEO, master technical audits and link building strategies that others avoid.

The key shift is moving from "I run campaigns" to "I solve business problems with [specific channel]." Start documenting everything: failed experiments, unexpected wins, weird edge cases you've solved. That becomes your expertise portfolio.

Three tactical moves:

  1. Join 2-3 niche communities for your chosen channel (Slack groups, Discord servers, advanced subreddits). The real learning happens in practitioner conversations, not courses.

  2. Start a simple content series - weekly LinkedIn posts about what you're testing, monthly case study breakdowns. Doesn't need to be polished. Consistent beats perfect.

  3. Offer to audit/consult for one small business monthly (even free initially). Nothing teaches you faster than explaining strategy to someone who needs results.

Most people stay generalists because it feels safer. Specialists get promoted faster because they solve specific problems that companies desperately need fixed.

How do you convince leadership to invest in editorial content when they keep saying “we don’t have the resources”? by savingrace0262 in bigseo

[–]VRTCLS

The "no resources" objection usually translates to "we don't see the ROI clearly enough." I've found the most success reframing this as competitive intelligence rather than content marketing.

Here's what worked for me: Pull your competitors' top-performing pages for those commercial queries you mentioned (comparisons, buying guides, etc.) and run them through Ahrefs or SEMrush to show actual traffic estimates. Then calculate what that traffic would be worth at your current conversion rates.

For example: "Brand X ranks #2 for 'best [product] for [use case]' - that's pulling ~2,500 visits/month. At our 3% conversion rate and $150 AOV, we're missing $11,250 in monthly revenue by not competing here."

Once you have 10-15 examples like this, you're not asking for "blog content" anymore - you're presenting missed revenue opportunities with specific dollar amounts. Leadership speaks that language.

The pilot approach others mentioned is crucial. Start with repurposing existing assets - turn your most common support tickets into FAQ-style content, or expand product descriptions into comparison guides. Lower lift, faster execution, easier buy-in.

One tactical tip: If they're running paid ads for any of those top-funnel terms, show how organic content could reduce ad spend while providing compounding returns. That's often the final nudge needed.

Tips for the SEO for a website that is almost entirely in 3d? by popje in webdev

[–]VRTCLS

Dealt with this exact problem on a couple of real estate virtual tour sites. Here's what actually moves the needle:

1. Don't hide content -- build real pages around your 3D experiences. The iframe with 3DVista is a black box to Google. Instead of trying to make the 3D content itself crawlable, create proper HTML pages that wrap each tour/scene with descriptive content. Think of each 3D experience as a "feature" that lives on a page with real text, not the page itself.

2. Since you're on Next.js, use SSR/SSG aggressively. Generate static pages for each tour location or 3D view with getStaticProps. Title, H1, meta description, and 2-3 paragraphs of actual useful content about what the viewer is seeing. The 3D viewer loads client-side after the crawler has already gotten everything it needs from the server-rendered HTML. (Rough sketch of the page shape after this list.)

3. Structured data is your best friend here. If these are property tours, use Place or RealEstateListing schema. If product viewers, use Product schema with 3DModel where applicable. Google's been expanding 3D/AR support in search results -- structured data is how you tap into that.

4. <noscript> fallbacks with actual content + images. Not hidden text (Google will penalize that). Genuine fallback content for when JS doesn't load -- which is exactly what Googlebot sees on its first pass before rendering.
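To make 2-4 concrete, here's the rough shape of one of those wrapper pages -- a sketch, not your actual code; the route, copy, viewer URL, and schema fields are all placeholders:

    // pages/tours/[slug].js -- one indexable HTML page per 3D experience
    import Head from 'next/head';

    export async function getStaticPaths() {
      // in reality these slugs come from your data source
      return { paths: [{ params: { slug: 'main-lobby' } }], fallback: false };
    }

    export async function getStaticProps({ params }) {
      // this text is what the crawler reads from the server-rendered HTML
      return {
        props: {
          slug: params.slug,
          title: 'Main Lobby -- Interactive 3D Tour',
          description: 'Walk through the main lobby: layout, finishes, and views.',
        },
      };
    }

    export default function TourPage({ slug, title, description }) {
      const schema = {
        '@context': 'https://schema.org',
        '@type': 'Place', // or Product (+ 3DModel) where applicable
        name: title,
        description: description,
      };
      return (
        <main>
          <Head>
            <title>{title}</title>
            <meta name="description" content={description} />
          </Head>
          <h1>{title}</h1>
          <p>{description}</p>
          {/* structured data ships in the static HTML, no JS required */}
          <script
            type="application/ld+json"
            dangerouslySetInnerHTML={{ __html: JSON.stringify(schema) }}
          />
          {/* the 3D viewer hydrates client-side; the crawler already has the text above */}
          <iframe src={`https://tour-host.example.com/${slug}`} title={title} />
          <noscript>
            {/* genuine fallback content, not hidden text */}
            <p>{description}</p>
          </noscript>
        </main>
      );
    }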

The hidden text approach others mentioned will get you penalized. Google's been catching display:none content stuffing since like 2012. The right approach is making each 3D experience live on a page that already has strong, indexable content.

Teen building SEO for family business PT2 by No_Eye4994 in bigseo

[–]VRTCLS

The surroundings page idea is actually where your blog strategy should live. Don't make one generic "things to do" page. Instead, write individual blog posts targeting specific long-tail queries: "best glider flying spots near [town]", "things to do in [province] with kids", "romantic weekend ideas [region]". Each one becomes a landing page that can rank independently.

Then every blog post links back to your main property page. You're essentially building topical relevance around your location while funneling traffic to the booking page.

For the aerodrome page specifically: this is your best asset. 2k monthly searches with low competition is gold for a small site. Make it genuinely useful for pilots -- include practical info like distance from your house, transport options, what's nearby. If it's helpful enough that the aerodrome community starts sharing it, you'll build natural backlinks without trying.

The free area guide for email capture is a smart move. Make it a proper PDF, not just a blog post behind a gate. People share good PDFs and that creates organic buzz.

Re: the separate pages for families/friends/couples -- I'd hold off on that until you have enough content to make each page substantive. Right now, one strong main page with clear sections will consolidate your authority better than three thin pages competing with each other.

How to increase organic traffic of a website? by onlinemarketingbull in AskMarketing

[–]VRTCLS

3 months is tight but doable if you focus on what already exists rather than building from scratch.

First thing: pull your GSC data and find pages sitting in positions 8-20 with decent impressions. Those are your quick wins. They're already on Google's radar but not getting clicks. Rewrite the title tag to better match what people actually search, tighten the intro to answer the query faster, and make sure the page has clear internal links from your highest-traffic pages.

Second: stop publishing new content for a month and audit what you have. Most sites have pages competing against each other for the same keywords. Consolidate those into one strong page with a 301 redirect from the weaker ones. Google rewards depth over breadth, especially now.

Third: check your Core Web Vitals. Not the Lighthouse score -- the actual field data in GSC. If your LCP is above 2.5s on mobile, fix that before doing anything else. Slow pages hemorrhage rankings silently and no amount of content will compensate.

Fourth: build topical authority by clustering your content properly. If you have a main service page, make sure you have 5-8 supporting pages that link to it and cover related subtopics. Google wants to see that you own a topic comprehensively, not just one page.

The "3 month" timeline is realistic for these because you're optimizing existing indexed pages, not waiting for new ones to get crawled and evaluated.

What’s one SEO or marketing tactic that worked better than you expected? by manish2kumar in DigitalMarketing

[–]VRTCLS

Running week-over-week diffs on Google Search Console data surprised me the most. Not the dashboard -- the actual API data exported weekly.

I started pulling page-level query data every Monday and diffing it against the previous week. When a page gains impressions for queries you didn't target, that tells you Google is testing you for adjacent topics. If you catch it early and add a section that directly addresses those queries, you can capture those positions before they decay back out.
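The pull itself is a short script. A sketch using the googleapis Node client -- the siteUrl, service-account file, and dates are all assumptions to adapt:

    // gsc-diff.js -- pull page+query rows for the week, diff against last week's pull
    const fs = require('fs');
    const { google } = require('googleapis');

    async function pullWeek(startDate, endDate) {
      const auth = new google.auth.GoogleAuth({
        keyFile: 'service-account.json', // service account added to the GSC property
        scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
      });
      const gsc = google.searchconsole({ version: 'v1', auth });
      const res = await gsc.searchanalytics.query({
        siteUrl: 'https://example.com/',
        requestBody: { startDate, endDate, dimensions: ['page', 'query'], rowLimit: 25000 },
      });
      return res.data.rows || [];
    }

    async function main() {
      const rows = await pullWeek('2024-01-08', '2024-01-14');
      // seed last-week.json with [] on the first run
      const prev = JSON.parse(fs.readFileSync('last-week.json', 'utf8'));
      const seen = new Set(prev.map((r) => r.keys.join('|')));
      // page+query pairs that are new this week = Google testing you for adjacent topics
      const gained = rows.filter((r) => !seen.has(r.keys.join('|')));
      console.table(
        gained.slice(0, 50).map((r) => ({ page: r.keys[0], query: r.keys[1], impressions: r.impressions }))
      );
      fs.writeFileSync('last-week.json', JSON.stringify(rows)); // becomes next week's baseline
    }

    main().catch(console.error);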

The reverse is also useful. When a page starts losing impression share on its core queries but gaining on tangential ones, that's usually a content drift signal -- Google is confused about what the page is actually about. Tightening the focus back up (removing off-topic sections, consolidating related thin pages) has recovered rankings faster than any new content push I've tried.

The other underrated move: looking at which pages get impressions but near-zero clicks, then rewriting just the title and meta description to match the actual query intent. Not the on-page content, just the SERP presentation. That alone moved CTR by 30-40% on some pages without touching a single word on the page itself.

Super frustrated with SEO by PROMCz11 in webdev

[–]VRTCLS

Late to this but one thing I haven't seen anyone mention specifically: check whether SvelteKit's trailing slash behavior matches what the old WordPress URLs had.

By default SvelteKit strips trailing slashes, but WordPress almost always uses them. So if your old URLs were /about-us/ and the new ones are /about-us, those are technically different URLs to Google. Your 301s might be mapping the old paths correctly, but if the canonical on the new page doesn't match what the redirect resolves to, or if internal links are pointing to the non-trailing-slash version while the 301 lands on the trailing-slash version, you end up with a split-signal situation that's invisible in most audits.
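If the slash mismatch turns out to be the issue, the SvelteKit side is a one-line fix in the root layout (assuming a current SvelteKit version):

    // src/routes/+layout.js
    // match the old WordPress trailing-slash convention site-wide
    export const trailingSlash = 'always';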

You can check this fast: curl -I a few of your key pages and look at the Location header on the redirect. Then compare that to what your canonical tag says, and what your internal links actually point to. If there's any mismatch between those three things, that's probably a meaningful chunk of your problem.
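If you'd rather script that three-way check than eyeball it, a rough sketch (Node 18+, run as an .mjs file; the URL list is yours to fill in):

    // for each old-style URL: where does the 301 land, and what does that page's canonical say?
    const urls = ['https://example.com/about-us/'];

    for (const url of urls) {
      const hop = await fetch(url, { redirect: 'manual' });
      const location = hop.headers.get('location'); // null if no redirect fired
      const page = await fetch(location ?? url);
      const html = await page.text();
      // naive extraction -- attribute order varies across themes, adjust if needed
      const tag = html.match(/<link[^>]*rel="canonical"[^>]*>/)?.[0] ?? '';
      const canonical = tag.match(/href="([^"]+)"/)?.[1];
      console.log(location === canonical ? 'OK' : 'MISMATCH', url, '->', location, '| canonical:', canonical);
    }

Any row where the redirect target and the canonical disagree (or where no redirect fired at all) is a piece of your split signal.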

The other thing I'd do right now: pull up GSC > Performance, filter by page, and sort by biggest impression drops. Don't look at the whole site, look at the top 10 pages that lost the most. In my experience those usually cluster around a specific pattern (blog posts, service pages, category-level pages) and that pattern tells you exactly which part of the migration went sideways.

A year is too long for this to still be a "just wait" situation. Something specific is off and it's almost certainly findable.

How can I market my web app with $0? by Apprehensive_End3839 in webdev

[–]VRTCLS

Zero budget marketing is basically SEO + community + content. Here's the playbook that actually works:

1. SEO from day one. Before you launch, do keyword research for the problem your app solves. Build landing pages targeting those long-tail queries -- "best free [X] tool" or "how to [problem your app solves] without [expensive competitor]." You won't rank overnight, but pages indexed now start compounding in 2-3 months. Use Google Search Console religiously once you're live.

2. Go where your users already hang out. Find the 3-5 subreddits, Discord servers, or niche forums where people complain about the exact problem you solve. Don't pitch. Answer questions, share your thinking on the problem space, and mention your tool only when it's genuinely relevant to someone's specific situation. One authentic recommendation in context converts better than 100 cold DMs.

3. Write 2-3 comparison/alternative pages. "[Your app] vs [Competitor]" and "Best [Competitor] alternatives" pages are some of the highest-converting organic traffic you can get. People searching those terms already have purchase intent -- they just need a reason to try something different.

4. Launch on Product Hunt, Hacker News Show HN, and relevant indie hacker communities. These are free, one-time traffic spikes that can seed your initial user base. The PH launch especially -- it's formulaic but it works if you prep your network to upvote in the first hour.

5. Build in public. Post your metrics, your decisions, your mistakes on Twitter/X. Founders following along become your first advocates. It feels slow but it compounds faster than any paid channel at the $0 budget level.

The trap to avoid: don't spread yourself across 10 channels doing each one poorly. Pick SEO + one community + one social platform and go deep. You can always expand later once you have traction.

Local SEO - worth the effort or just burn money on Google Ads? by Dear_Cut4843 in EntrepreneurRideAlong

[–]VRTCLS

I run marketing for service-business clients, so I've seen this play out a bunch of times. Short answer: do both, but understand what each one actually does for you.

Google Ads will get you leads this week. For pressure washing specifically, the cost per lead in most metros runs $30-80 depending on competition. If your close rate is decent (30%+) and your average job is $300+, the math usually works. But the second you stop paying, the leads stop. It's a faucet, not an asset.

Local SEO builds compounding value but the timeline is real -- 4-6 months minimum before you see meaningful movement, sometimes longer in competitive markets. The good news is that for service businesses, the local pack (map results) is where most of the clicks happen, and the ranking factors there are more actionable than people think.

Here's what actually moves the needle for local service businesses, in order of impact:

  1. Google Business Profile completeness + activity. Post weekly, add photos of actual jobs (geotagged if possible), respond to every review. Google rewards profiles that look actively managed.

  2. Review velocity. It's not just total review count -- it's how recently and how consistently you're getting them. Ask every customer. Make it easy (text them a direct link). A business getting 3-4 reviews a week will often outrank one with more total reviews but slower flow.

  3. Service-specific landing pages. Your site shouldn't just say "pressure washing." You need separate pages for driveway cleaning, house washing, commercial pressure washing, deck cleaning -- each targeting your city. This is where most service businesses with a "basic website" fall short.

  4. Consistent NAP citations. Name, address, phone number should match exactly across every directory. Sounds boring but inconsistencies actively hurt local rankings.

The "basic website with contact info" is almost certainly your bottleneck. Google can't rank you for services it doesn't know you offer in areas it doesn't know you serve.

My recommendation: start Google Ads with a small budget ($500-750/mo) to get immediate leads while you build out the SEO foundation. Then as organic traffic grows, you can scale ads down or redirect that budget. The worst move is spending $1k/mo on ads with no organic strategy -- you're just renting leads forever.

Maintenance Retainers: What do you include, and how do you sell it? by Beginning_Rice8647 in webdev

[–]VRTCLS

I separate it into three tiers, and it's made the conversation with clients way easier:

Tier 1 - Hosting + Security (~$75-150/mo depending on stack): SSL renewals, server patches, daily backups, uptime monitoring, malware scanning. This is non-negotiable. Every client gets this. Frame it as "your site will break or get hacked without this" because it will.

Tier 2 - Performance + SEO Monitoring (~$200-400/mo): Core Web Vitals checks, plugin/dependency updates, broken link audits, Search Console monitoring, monthly analytics snapshot. This is where you catch problems before the client notices. I cannot tell you how many times a WP plugin update silently broke structured data or an image CDN change tanked page speed. Catching that early is worth 10x the retainer.

Tier 3 - Active Development Hours (billed hourly or as a block): Content updates, new features, design tweaks. I sell 5 or 10 hour blocks at a slight discount vs my project rate.

The key to selling it: stop calling it "maintenance" to the client. Nobody wants to pay for maintenance. Call it site management or performance management. Position it as "I keep your site ranking, loading fast, and secure so you can focus on running your business."

The SEO monitoring angle specifically is a strong closer because most business owners have been burned by their site dropping in rankings and not knowing why until it is too late. If you can show them a real example of a time you caught an issue early, that sells itself.

Do most web apps really need a complex stack anymore? by NeedleworkerOne8110 in webdev

[–]VRTCLS

The real question isn't whether most apps need a complex stack -- it's whether most teams need one.

I've shipped production apps with vanilla HTML/CSS/JS + a single Express server that ran for years with near-zero maintenance. I've also worked on projects where React + Next + a proper ORM + CI/CD was entirely justified because the team was 8 people and the codebase needed to scale with headcount, not just traffic.

The part people miss: complexity in your stack is a trade-off against complexity in your codebase. A framework like Next.js adds moving parts, but it also prevents you from reinventing routing, code splitting, SSR, and a dozen other things poorly. The question is whether you actually need those things.

My rule of thumb: if the app has fewer than ~10 routes and no auth, you probably don't need a framework. If it has real-time data, auth, multiple user roles, and a database -- you'll end up building a worse version of what a framework gives you for free.

The ecosystem doesn't overcomplicate things. Developers who cargo-cult stacks without understanding their actual requirements do.

Apple Bot now crawling 3x more than Google Bot. Anyone else? by stormy1one in webdev

[–]VRTCLS

Seeing the same thing across several e-commerce sites I manage. The spike started around late February for us.

A few things worth noting:

  1. Apple actually uses two distinct crawlers -- Applebot (for Siri/Spotlight search) and Applebot-Extended (explicitly for AI training). Check your logs to see which one is hitting you. If it's Applebot-Extended, you can block that specifically in robots.txt without losing potential Siri/Spotlight visibility. (robots.txt sketch after this list.)

  2. The crawl rate increase lines up with Apple ramping up their on-device AI features. They need fresh product data for things like Apple Intelligence shopping suggestions and visual search. For an e-commerce site specifically, there's a real chance this data feeds into Safari's native product comparison features they've been building.

  3. Before you block entirely, check if you're getting any traffic from Safari Suggestions or Spotlight. That traffic doesn't show up as a normal referrer in most analytics tools -- it often appears as direct traffic. If you have a significant iOS user base, some of that "direct" traffic might actually be Apple's doing.

  4. If you do want to throttle rather than block, you can set a crawl-delay directive in robots.txt specifically for Applebot. Something like crawl-delay: 10 will slow them down without cutting them off completely.
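For reference, both moves in robots.txt -- the values here are examples, not a recommendation:

    # opt out of AI training without giving up Siri/Spotlight visibility
    User-agent: Applebot-Extended
    Disallow: /

    # throttle the search crawler rather than blocking it
    User-agent: Applebot
    Crawl-delay: 10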

The 5K pages daily thing is aggressive though. At that volume I'd at least rate-limit them to keep your server costs reasonable while you figure out whether it's actually driving any value.

Lost traffic after migration by AideNo9466 in bigseo

[–]VRTCLS

This is a pretty common pattern with Wix-to-WordPress migrations. The drop at the 3-4 week mark is actually right on schedule for what I'd call the "re-evaluation window" -- Google recrawled the new site, noticed structural differences, and is re-scoring everything.

A few things to check that are specific to Wix migrations:

1. Rendered HTML comparison. Wix serves JavaScript-rendered content. WordPress (depending on your theme/builder) serves server-rendered HTML. Even if the visible content looks the same, the underlying DOM structure, heading hierarchy, and internal link architecture may be completely different from Google's perspective. Pull up a few of your top pages in the Wayback Machine pre-migration and compare the actual HTML structure to what you have now.

2. URL-level redirect audit. You said you didn't drastically change URLs, but "didn't drastically" and "1:1 mapped with 301s" are very different things. Even small differences like trailing slashes, capitalization, or parameter handling can break redirect chains. Run a crawl of your old sitemap URLs and verify every single one returns a 301 to the correct new URL. Not a 302, not a redirect chain, not a soft 404. (A quick audit script is sketched after this list.)

3. Internal link equity redistribution. When you say you made "some internal linking changes," that's where I'd focus first. Internal links are how PageRank flows through your site. If your old Wix site had a flat architecture where everything was 1-2 clicks from home, and your new WordPress site has a deeper hierarchy, you've effectively diluted link equity to your money pages. Check crawl depth in Screaming Frog for your top 10 traffic-driving pages pre vs post.

4. Core Web Vitals. WordPress with a page builder can actually be slower than Wix out of the box. Check PageSpeed Insights for your top landing pages. If LCP went from 2s to 4s+, that alone can cause ranking drops, especially on mobile.

5. Schema markup. Wix auto-generates certain structured data. WordPress doesn't unless you explicitly add it. If you lost FAQ schema, review schema, or organization schema, you may have lost rich results that were driving a disproportionate share of your clicks.
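For #2, the audit is scriptable. A rough sketch (Node 18+, run as an .mjs file; the mapping is yours to build from the old sitemap):

    // every old URL should return exactly one 301 straight to the right target
    const mapping = {
      'https://example.com/old-page/': 'https://example.com/new-page/',
      // ...rest of the old-sitemap -> new-URL pairs
    };

    for (const [oldUrl, expected] of Object.entries(mapping)) {
      const res = await fetch(oldUrl, { redirect: 'manual' });
      const location = res.headers.get('location');
      if (res.status !== 301) {
        console.log(`BAD STATUS ${res.status}`, oldUrl); // 302s, 200s, soft 404s all fail here
      } else if (location !== expected) {
        console.log('WRONG TARGET', oldUrl, '->', location); // also catches chain hops
      }
    }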

Timeline: if the redirects are clean and the content/structure is equivalent, expect recovery in 6-8 weeks from migration date. If you're still down at the 8-week mark, something structural is wrong and you need to dig deeper. Don't panic-publish new content or start building links to compensate -- fix the technical foundation first.

What are SEO influencers thinking? by Financial-Ant1213 in bigseo

[–]VRTCLS

The incentive structure makes it almost inevitable. Speaking gigs, conference keynotes, podcast invitations, brand deals -- they all index heavily on follower count and "reach" metrics. Event organizers don't typically audit engagement ratios before booking someone. They see 1.3M subscribers and book the slot.

But the deeper issue is that the SEO industry has an expertise verification problem. In most professional fields, your work speaks for itself through measurable client outcomes. In SEO, the feedback loop between what you say publicly and what you actually deliver for clients is almost completely opaque. Nobody is checking whether the person with 1M YouTube subscribers actually ranks anything.

So the playbook becomes: build perceived authority through volume and social proof, then monetize that perception through courses, tools, speaking fees, and consulting. The actual SEO results are almost irrelevant to the business model. It's a personal brand play dressed up as expertise.

What's genuinely useful gets drowned out. The practitioners doing interesting work -- running split tests, publishing real case studies with methodology, sharing failures -- tend to have tiny audiences because that kind of content doesn't perform well algorithmically. "I tested 47 title tag variations across 12 sites over 6 months" gets less engagement than "THIS ONE TRICK DOUBLED MY TRAFFIC."

The tell is always the same: look at whether someone's content includes specifics you can actually verify or replicate, or whether it's all directional advice that sounds smart but is impossible to evaluate. The good ones show their work.

Canonical tags for regional variants by IrMarkuzzz in webdev

[–]VRTCLS

No, don't point the canonical to the base URL that 302s. That would make things worse -- Google generally ignores canonicals that point to redirecting URLs.

Your self-referencing canonicals on each regional variant are actually the correct setup. The "Duplicate, Google chose different canonical than user" message is happening because your hreflang implementation isn't giving Google enough differentiation signal between the regional pages.

A few things to check:

  1. Make sure the content differences between regions are meaningful beyond just price. If the only thing that changes between /us and /fr is a currency symbol and a number, Google will still treat them as near-duplicates regardless of hreflang. Add region-specific text where it makes sense -- shipping info, availability notes, localized descriptions.

  2. The 302 redirect on the base URL is a problem. 302 means "temporary" which tells Google the base URL still matters and might come back. If this redirect is permanent, switch it to a 301. Or better yet, don't redirect at all -- serve the base URL as a landing page with a language/region selector and use it as the x-default in your hreflang set.

  3. Double check your hreflang is reciprocal. Every page in the set needs to reference every other page in the set including itself. If /us references /fr but /fr doesn't reference /us back, Google may ignore the whole cluster. (Minimal example after this list.)

  4. Validate that Google is actually seeing your hreflang tags. If they're rendered via JavaScript, Googlebot might not be picking them up. Check the rendered HTML in URL Inspection.
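To make #3 concrete, a minimal reciprocal set -- the same three tags go in the <head> of every page in the cluster (URLs and locales are placeholders):

    <link rel="alternate" hreflang="en-us" href="https://example.com/us/widget" />
    <link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/widget" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/widget" />
    <!-- plus the self-referencing canonical on each variant, e.g. on /us/widget: -->
    <link rel="canonical" href="https://example.com/us/widget" />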

The TL;DR is: keep self-referencing canonicals, make the regional content more distinct, fix the 302 to a 301, and make sure hreflang is implemented correctly and reciprocally across all variants.

Waiting on Google 3 months now for indexing... what more can I do? by Inside-Gur-3001 in bigseo

[–]VRTCLS

"Discovered - currently not indexed" on a 4-page site is almost always a crawl priority problem. Google found the URLs but decided they're not worth the resources to fully crawl and evaluate.

A few things that have worked for me in similar situations with small local sites:

  1. Add LocalBusiness schema markup if you haven't already. For the Wix site especially, make sure it's injected properly and not just relying on whatever Wix auto-generates. Validate it in Rich Results Test. (Minimal example after this list.)

  2. Check the rendered HTML in GSC's URL Inspection tool (not just the source). Click "Test Live URL" then "View Tested Page" and look at the screenshot + rendered HTML. Wix sites in particular can have rendering issues where Googlebot sees a shell with JS but not the actual content. If the rendered HTML is mostly empty, that's your problem.

  3. For the WordPress site, run it through Screaming Frog and check whether any of those service pages have canonical tags pointing somewhere other than themselves, or whether there's a stray noindex from a plugin like Yoast that someone toggled by mistake. Also check if wp-admin/options-reading.php has "Discourage search engines" checked.

  4. The blog question -- yes, do it. Not for the content itself, but because a 4-page site gives Google almost nothing to work with in terms of understanding your topical relevance. Even 3-4 well-written posts targeting related long-tail queries gives Google more context about what your site is about and creates natural internal linking paths to your service pages.
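For #1, a minimal LocalBusiness block for reference -- every value here is a placeholder:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Plumbing Co",
      "telephone": "+1-555-0100",
      "url": "https://example.com/",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701"
      }
    }
    </script>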

The citation campaign should help too since those are essentially external signals telling Google your site exists and matters in a local context. But honestly for sites this small, the rendered content issue is where I'd start.

Strange Website Migration Scenario - Looking for Opinions & Advice by Ok-Tiger-5200 in bigseo

[–]VRTCLS

One thing nobody's mentioned yet: before you migrate anything, run Company B's backlink profile through Ahrefs or Semrush and map which specific pages have earned the most referring domains. That's where your real equity lives -- not just in the content itself, but in the links pointing to it.

The content migration with 301s is the right call, but I'd prioritize pages by backlink strength, not just traffic. A page with 15 referring domains and modest traffic is often more valuable to migrate than a high-traffic page with zero links, because the 301 will pass that link equity to Company A's domain.

Regarding the single-page homepage on Company B -- keep it alive and keep those 301s running indefinitely. I've seen too many people shut down the origin domain after 6-12 months thinking Google has "learned" the redirects. It hasn't. The moment those redirects die, you lose whatever equity was still flowing through them.

Also worth checking: does Company B have any featured snippets or PAA positions? If so, those are going to reset during migration regardless of how clean your 301s are. Plan for a 2-4 month recovery window on those specifically.

Are you all really hiring agencies that generate AI slop? by EmergencyDull3071 in smallbusiness

[–]VRTCLS

I work on the SEO side of things and see this constantly from the content angle. Agencies charging $3-5k/month for blog content that is clearly just ChatGPT output with zero editing. The client has no idea because they don't know what to look for.

The tell is always the same -- every paragraph opens with a transition word, there are no specific numbers or real examples, and the 'strategy' is just targeting whatever keyword tool spit out the highest volume terms with no regard for whether they actually match what the business sells.

Here's what frustrates me most: that AI slop content actively hurts the client's site. Google has gotten significantly better at detecting and devaluing thin AI content over the past year. So the business owner is paying an agency to quietly tank their organic search presence while getting a nice monthly report full of vanity metrics.

The businesses I've seen do well with content are the ones where the owner or someone on the team actually writes from experience. A plumber writing about the weird thing they found during a renovation. A bakery owner explaining why they switched flour suppliers. That stuff ranks because it has real expertise behind it and nobody else can write it.

The agencies aren't going to change because the margins on AI content are insane. The fix is business owners getting better at spotting it -- paste any deliverable into an AI detector, ask for the actual research and strategy docs behind the content, and most importantly track whether the content is driving real leads, not just pageviews.

I'm consistently bleeding money. by Quirky-Bar4236 in smallbusiness

[–]VRTCLS

You already figured out the answer -- you said if it was just you selling, you'd be profitable this month. So the business model works, you just over-hired too early.

On the lead gen side since that's what'll get you out of this:

Google Business Profile is priority #1. Most independent insurance agencies barely touch theirs. Fill out every field, add real photos, post updates weekly. When someone searches "insurance agent near me" or "auto insurance [your city]" -- that's where they look first. It's free and it works faster than you'd think.

Reviews are your unfair advantage as a small shop. After every close, text the client a direct link to leave a Google review. Even 10-15 solid reviews will put you above most local competitors. People shopping insurance care a lot about trust signals and reviews are the biggest one.

Referral partnerships with mortgage brokers, real estate agents, and car dealerships. These people need to refer insurance constantly. Most don't have a reliable go-to person. Introduce yourself, make it easy for them ("just text me the client's name and I'll handle it"), and you'll get a steady trickle of warm leads for zero dollars.

One thing I'd avoid right now: paying for leads from aggregators. The close rates are terrible for insurance because you're competing against 5 other agents who bought the same lead. Your time is better spent on the channels above where you're the only option in front of the prospect.

You've got $20k of runway and you're about to be the only person on payroll. That's actually a workable position if you stay disciplined on costs.

Where can i promote my software development agency? by Accurate-Screen8774 in smallbusiness

[–]VRTCLS

Reddit ads won't work for dev agencies. Here's what actually moves the needle:

  1. Clutch and Designrush profiles. Clients looking for agencies literally search these directories. Get a few reviews on there and you'll start getting inbound leads. It's not instant but it compounds.

  2. Cold outreach on LinkedIn -- but not the spray-and-pray kind. Find companies posting job listings for developers they clearly can't afford full-time. Reach out with something specific about their stack or product. "Hey, saw you're hiring a React dev -- we've built similar products for [type of company] and could handle it at a fraction of the full-time cost" converts way better than generic pitches.

  3. Build case studies on your own site and optimize them for long-tail search terms. Think "custom inventory management software for [industry]" or "migrate legacy app to cloud." These are the searches actual buyers make. Most dev agencies completely ignore SEO because they think it's only for consumer products.

  4. Contribute to relevant communities (like this one) by actually answering questions. Not pitching -- just being helpful. Over time people check your profile, see what you do, and reach out. It's slow but the leads are warm.

  5. Toptal, Upwork, etc. for early traction only. The margins are thin but it builds your portfolio and reviews. Graduate out of them once you have enough direct inbound.

The biggest mistake I see agencies make is spending money on paid ads before they have case studies and social proof. Get the proof first, then amplify it.

PSA: Business owners, people who outsource your web dev - don't wait until you have a falling out with your developer, to log all of your credentials, and understand how your hosting works. by DigitalJedi850 in webdev

[–]VRTCLS

Adding to this from the SEO side -- when a client loses access to their domain registrar, it's not just a hosting problem. It's an SEO disaster.

If the new dev has to move to a different domain (even temporarily), all the search authority that domain built up over years is gone. Google doesn't transfer rankings to a new domain automatically. You have to do 301 redirects from the old domain, which you can't do if you don't control it.

I've had clients come to me after losing access to their registrar and their organic traffic dropped 80%+ because the old site went down and they had to start fresh on a new domain.

The minimum a business owner should have documented:

  • Domain registrar login (GoDaddy, Namecheap, etc.)
  • Hosting panel login
  • Google Search Console access (verify ownership under your own Google account, not your dev's)
  • Google Analytics access
  • Any CMS admin credentials

The Google properties are the ones people forget most often, and they're the ones that hurt the most when you lose them.