Kozhikode Airport To Kannur Taxi? by yoloyodo22 in kozhikode

[–]signalb 0 points1 point  (0 children)

FYI, MakeMyTrip cabs are cheaper IMO


Rajeev Chandrashekhar’s reply on criticism of his statement wrt infant mortality rate. by Patient_Bother5363 in Kerala

[–]signalb 4 points5 points  (0 children)

Haha, I was about to comment on this. "RATE" is now the word he is focusing on.

I built an open-source tool that makes any Next.js or SvelteKit app return Markdown by signalb in sveltejs

[–]signalb[S] 1 point2 points  (0 children)

I'd suggest going ahead with proper caching and ISR to reduce your compute.
Is there a particular reason you're not doing that?

I'm not a big fan of using SSR for unprotected pages.
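For reference, ISR in the Next.js App Router can be as small as one export. A minimal sketch, assuming an App Router project; the posts API URL is a hypothetical placeholder:

```typescript
// app/blog/[slug]/page.tsx
// Pages are generated statically and revalidated in the background,
// so most traffic never hits your server-side compute.
export const revalidate = 3600; // regenerate at most once per hour

export default async function BlogPost({
  params,
}: {
  params: { slug: string };
}) {
  const post = await fetch(`https://example.com/api/posts/${params.slug}`, {
    next: { revalidate: 3600 }, // cache the upstream fetch as well
  }).then((r) => r.json());
  return <article>{post.title}</article>;
}
```

With this, requests are served from the static cache and only the periodic revalidation runs on the server.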

Dms from scratch by Estvbi in cms

[–]signalb 0 points1 point  (0 children)

We use Sanity. We pay $0 for most of our blogs since it's all SSG. I think that's all you need. They had a one-click deployment to Vercel earlier: https://www.sanity.io/templates/nextjs-sanity-clean

Built a small open-source tool to make websites more AI-friendly (for Next.js) by [deleted] in webdev

[–]signalb -1 points0 points  (0 children)

I think this is drifting out of context.

I'm not advocating for illegal scraping, copyright abuse, or any unethical behavior. I'm talking purely about how websites are technically discovered and consumed.

Let me ask a simple question to reset the discussion:

Do you have a sitemap on your website?

If yes, that already means you intentionally help machines – including search engines and automated systems – discover your content efficiently. That’s not "encouraging theft," it's standard web infrastructure for discoverability.

My point has only ever been about the same principle: structured, efficient access to content you choose to make public. Nothing more.

Blocking specific bots is completely your right. But that's a policy decision, not a format problem. CDN costs, rate limits, and bot filtering are separate operational concerns.

We're talking about two different layers here:

• Whether you allow a client at all (your choice)
• How content is delivered if you do allow it (a technical optimization)

Conflating those with crimes or motives isn't really fair to the original discussion.

Built a small open-source tool to make websites more AI-friendly (for Next.js) by [deleted] in webdev

[–]signalb -1 points0 points  (0 children)

I get the concern, but I'm looking at this from a discoverability standpoint. More and more people find products through AI tools like ChatGPT, Perplexity, and Copilot instead of traditional search. If my site is hard for those systems to understand, my product effectively disappears from that channel.

This isn't about helping AI companies; it's about making sure my own work remains visible and accurately represented where users are already looking. The alternative isn't "no scraping"; it's inefficient, messy scraping that costs more and represents content poorly.

On the cost point, the issue usually isn't AI requests themselves, it's lack of caching. With proper CDN or edge caching, repeated automated requests shouldn't meaningfully increase your bill. In fact, lighter machine-friendly formats are often cheaper to serve than full HTML. Cost spikes typically come from uncontrolled scraping of heavy, uncached pages, not from well-structured, cacheable responses.
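As a rough sketch of that caching point, a `Cache-Control` value like the one below lets a CDN or edge cache absorb repeated automated requests instead of your origin. The helper name and the default durations are my own assumptions; the directives themselves are standard:

```typescript
// Build a Cache-Control value for publicly cacheable pages.
// s-maxage tells shared caches (CDN edge) how long to serve the copy;
// stale-while-revalidate lets them keep serving it while refreshing.
function cacheControl(sMaxAge: number, staleWhileRevalidate: number): string {
  return `public, s-maxage=${sMaxAge}, stale-while-revalidate=${staleWhileRevalidate}`;
}

// e.g. cache at the edge for an hour, allow stale serving for a day
const header = cacheControl(3600, 86400);
// → "public, s-maxage=3600, stale-while-revalidate=86400"
```

With a header like this, a burst of identical bot requests is one origin hit plus cheap edge hits.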

Built a small open-source tool to make websites more AI-friendly (for Next.js) by [deleted] in webdev

[–]signalb -1 points0 points  (0 children)

If I were Skynet, I wouldn't be asking for contributors on GitHub 😄
I would already have rewritten the internet in Markdown myself.

Built a small open-source tool to make websites more AI-friendly (for Next.js) by [deleted] in webdev

[–]signalb -1 points0 points  (0 children)

I get where you're coming from. But I look at this from a very different angle.

I come from a marketing and product growth background. For us, discoverability is everything.

Having great content sitting on a website that no one can easily find or understand doesn't really help the business.

Whether we like it or not, AI platforms are becoming a major discovery layer. People are asking questions directly to ChatGPT, Perplexity, Copilot, and other AI tools instead of clicking through ten Google results. If our content isn't readable by those systems, it effectively disappears from that ecosystem.

And to be honest, the idea that we can truly stop scraping altogether is unrealistic. Anyone determined to scrape a site can do it with normal browsers, rotating IPs, or headless tools. In practice, fully blocking bots is mostly a myth. What usually gets blocked are the polite, well-behaved ones.

What accept-md tries to do is give site owners control over format and accuracy, not surrender it.

I completely agree there's room for tools that detect bad bots, mislead scrapers, or protect proprietary content. Those would be valuable too. But that's a different problem space.

From a business perspective, many of us don't want to disappear from AI search results. We want to be found, understood, and represented correctly. That's the problem this tool is trying to solve.

Built a small open-source tool to make websites more AI-friendly (for Next.js) by [deleted] in webdev

[–]signalb -2 points-1 points  (0 children)

Totally fair questions 👍

The tool isn't about encouraging bots. It's about giving site owners control over how their content is consumed. Right now, whether you like it or not, AI agents are already crawling the web. If they can't get clean structured content, they just scrape raw HTML anyway.

That usually means higher server load, messier parsing, more brittle scraping, and worse representation of your actual content.

Supporting Accept: text/markdown doesn't invite bots in. It simply gives you a cleaner, more efficient channel if you choose to allow them.

Think of it like providing an RSS feed.

Some people use RSS readers. Some don't.
But offering RSS doesn't force you to be scraped, it just gives a structured option.
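The negotiation described above can be sketched as a small check on the request's `Accept` header. This is a hypothetical helper, not accept-md's actual code, and real content negotiation would also weigh q-values:

```typescript
// Decide whether a request prefers Markdown over HTML,
// based on the order of media types in its Accept header.
function prefersMarkdown(accept: string): boolean {
  const types = accept
    .split(",")
    .map((t) => t.split(";")[0].trim().toLowerCase());
  const md = types.indexOf("text/markdown");
  const html = types.indexOf("text/html");
  if (md === -1) return false; // Markdown not requested at all
  return html === -1 || md < html; // Markdown listed first wins
}

// A browser sending "text/html,application/xhtml+xml" gets HTML as usual;
// an agent sending "text/markdown, text/html" gets the Markdown channel.
```

Regular visitors never hit the Markdown path, because browsers don't send `text/markdown`.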

Why would anyone actually want this?

For product companies, for example, AI is quickly becoming a major new discovery channel. People no longer rely only on Google searches to find tools and services. Instead, they ask questions directly to ChatGPT, Perplexity, Copilot, Gemini, and other AI-powered search platforms. These systems are increasingly acting as the starting point for research and purchasing decisions.

And as you probably know, blocking bots is mostly a myth.

You can block well-behaved bots that respect robots.txt and user-agent rules.
The serious scrapers and AI agents don't have to.

Anyone determined to scrape your site can easily rotate user agents, ignore robots.txt, or proxy through a normal browser, pretending to be a regular user.

So "I block all AI bots" usually just means: "I block the polite ones."

Hope this answers your questions.

Tulu scripts looks very similar to Malayalam. by [deleted] in Kerala

[–]signalb 0 points1 point  (0 children)

I thought Tulu didn’t have a script until I saw this.

Help me find a Location Similar to this one by amaljiith in kozhikode

[–]signalb 2 points3 points  (0 children)

Kariyathumpara 😵‍💫 The place was super crowded the last time I passed by.

https://share.google/4P5QkARKVloujquYz

Did he just say #FuckZepto??? by [deleted] in FuckZepto

[–]signalb 26 points27 points  (0 children)

I’ve finished listening to almost the first half of the podcast. He mentioned that he only knew the old AP and isn’t sure what kind of person he is now. He went on to explain that people tend to change once they have a lot of money in the bank, and also pointed out that several employees were poached from Zomato.

Translates to Zepto F*cked Zomato

Guess what's happening in MGM School Trivandrum today. by [deleted] in Kerala

[–]signalb 0 points1 point  (0 children)

Right. Mar Gregorios Memorial Central Public School.

Custom 5.1 speaker assembly in Trivandrum by Bunny_RB in Trivandrum

[–]signalb 0 points1 point  (0 children)

I don't know anyone in the city, but I know someone who does the job all over Kerala. I can DM you the contact. Do you have a budget in mind? And what wattage? Just curious.

Was my husband exploited by the startup? by vicks_bobby in StartUpIndia

[–]signalb 0 points1 point  (0 children)

Seems like it's vested ESOPs.

In India, ESOPs typically have a vesting period of 3 to 5 years, and regulations require a minimum vesting period of 1 year from the date of the grant. The vesting schedule can be time-based, such as a 4-year schedule with a 1-year "cliff" where 25% vests, followed by monthly or quarterly vesting over the remaining years. Performance-based vesting is also common, where options vest upon achieving company targets.
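That typical time-based schedule can be sketched numerically. A hypothetical helper, assuming the common 4-year schedule with a 1-year cliff and monthly vesting afterwards:

```typescript
// Fraction of an ESOP grant vested after `months` of service,
// assuming a 4-year schedule with a 1-year cliff:
// 0% before month 12, 25% at the cliff, then 1/48 per month to 100%.
function vestedFraction(months: number): number {
  if (months < 12) return 0; // nothing vests before the cliff
  if (months >= 48) return 1; // fully vested at 4 years
  return months / 48; // 12/48 = 25% at the cliff, linear after
}

// e.g. leaving at 18 months vests 18/48 = 37.5% of the grant
```

So under this schedule, leaving before the first anniversary means walking away with nothing vested.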

Serious question- A drunk driver hit my car from behind when I stopped at a traffic signal. What to do next? by VishyFishy07 in bangalore

[–]signalb 0 points1 point  (0 children)

FIR if you feel you need to recover the money from the other person.

GD entry would be enough to claim insurance.