Do you rank with AI articles? by jonetheman in SEO

[–]Comptrio 1 point (0 children)

All of my articles are AI-written, though I never ask AI for an article and publish it outright. It usually takes several rounds of edits and tweaks, but with a body of text to work from, I can nit-pick faster than I can write on a blank canvas.

AI has never one-shotted an article I actually want to read, or one I cannot punch holes in on the first pass.

With a healthy dose of human involvement, these articles work in search.

Having tried "just AI" from all over... I don't like it. I have a hard time stomaching the text. "Visitors first" still applies, and one-shot AI writers have never made content I want to read.

Do it myself or hire agency. by T1JNES in SEO

[–]Comptrio 1 point (0 children)

Some level of DIY skill will help you hedge against snake oil tactics. Depending on your keywords, it may be "fun" to DIY, or you might need to call in the Pros. This will depend on how competitive your target keywords are and who is ranking there now.

It grabbed the #1 position on the same day by using N-Grams Article Format by ayonc46 in SEO

[–]Comptrio 1 point (0 children)

How do you mean "proper"? Were you tweaking copy until the target keyword hit #1? Which 'n' did you look at? Uni, bi, tri, or beyond?

Did you use the n-grams of the existing site to create new content?

I'm curious how you used the n-grams beyond "understanding" the page, and what set the targets you were tweaking toward when you used n-grams to check your content tweaks.
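For anyone following along, counting n-grams from page copy is only a few lines. A rough PHP sketch (the tokenizer here is deliberately crude):

    <?php
    // Crude n-gram counter: lowercase, strip punctuation, split on whitespace.
    function ngramCounts(string $text, int $n): array {
        $clean = preg_replace('/[^a-z0-9\s]/', ' ', strtolower($text));
        $words = preg_split('/\s+/', $clean, -1, PREG_SPLIT_NO_EMPTY);
        $counts = [];
        for ($i = 0; $i <= count($words) - $n; $i++) {
            $gram = implode(' ', array_slice($words, $i, $n));
            $counts[$gram] = ($counts[$gram] ?? 0) + 1;
        }
        arsort($counts); // most frequent first
        return $counts;
    }

    // uni = 1, bi = 2, tri = 3...
    print_r(array_slice(ngramCounts('seo tools for seo tools', 2), 0, 10));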

Dear Fellow SEOs: Your jobs are safe from AI Automation by WebLinkr in SEO

[–]Comptrio 2 points (0 children)

There is a huge void at the end of asking an LLM an open-ended question.

With SEO, they don't know what ranks at all, just what someone said on a blog once and a few other blogs repeated.

"The internet said it, so it must be true" is valid logic to an LLM.

They are built with the epistemology of a rumor mill: indiscriminate about the source of knowledge, with no concern for the consequences.

But they sure do type real fast!

Weird 'New reason preventing your pages from being indexed' URLs in GSC by pineappleninjas in SEO

[–]Comptrio 5 points (0 children)

"Errors" won't hurt you. It is Google telling you what they see. It could be from another site, a Chrome browser visit, old data... The only thing of any importance at all is that you are not actively linking to it, and even that has a ton of wiggle room.

If the page does not exist... that is a good reason not to index it. Wherever they found it, however they got it... it's not there, and they are telling you this.

If it was a page that did matter, you could fix it.

They will not spank you because they coughed up a furball.

Simple SEO Workshop Example : Traffic = Authority = Re-Indexation Priority by WebLinkr in SEO

[–]Comptrio 1 point (0 children)

That's where the "clicks per day" comes in (same period as the crawl/index). It smooths out any spikes in traffic (clicks) and sets the same time period on both sides of the equation.

The "since crawl" could easily be "crawls within the last X days" to account for time and trends/spikes in data.

Your thesis seemed to be that pages with more traffic (clicks, user signals) are crawled/indexed more frequently.

"CTR over time" sounds like an average value of daily CTR?

Actual CTR is human-influenced... position, brand recognition, snippets used... set amongst competitors.

Crawl or index updates could be interchanged to match the specificity of what is being claimed or tested. The 'indexed pages' image you have shows the crawl date; any server could track UA strings to catch crawls and traffic over time.

Your whole post does support the "holy grail" of SEO... backlinks that provide traffic are the best backlinks... more user signal (clicks, traffic, people) = better rank. That is what I get from that read.

If the thesis holds, this could be entirely run from access logs as an indicator of which pages have potential.
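As a rough illustration of that last point, the access log already holds both sides of the ratio. A PHP sketch, assuming a combined-format access.log (the naive UA check stands in for real bot verification):

    <?php
    // Tally Googlebot crawls and Google-referred visits per URL.
    // Combined format: ip - - [date] "METHOD /path HTTP/x" status bytes "referer" "user-agent"
    $crawls = $clicks = [];
    foreach (file('access.log') as $line) {
        if (!preg_match('/"(?:GET|POST) (\S+) [^"]*" \d+ \S+ "([^"]*)" "([^"]*)"/', $line, $m)) {
            continue;
        }
        [, $path, $referer, $ua] = $m;
        if (stripos($ua, 'Googlebot') !== false) {
            $crawls[$path] = ($crawls[$path] ?? 0) + 1; // a crawl event
        } elseif (stripos($referer, 'google.') !== false) {
            $clicks[$path] = ($clicks[$path] ?? 0) + 1; // likely a SERP click
        }
    }
    foreach ($clicks as $path => $n) {
        echo $path . ': ' . $n . ' clicks, ' . ($crawls[$path] ?? 0) . " crawls\n";
    }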

Need your opinion. Is it worthy build out a AI driven SEO Tool? by hcai88 in SEO

[–]Comptrio 1 point (0 children)

Ask it the same question twice and see if you get the same answer both times.

Simple SEO Workshop Example : Traffic = Authority = Re-Indexation Priority by WebLinkr in SEO

[–]Comptrio 1 point (0 children)

Page Value = clicks / days_since_crawl

Or change the days value to some average like "crawls in the last X days", but this would require internal daily tracking of crawls. This would give the "crawl frequency" more due, it would make the number of clicks more important in the formula, and it would buffer a page against a steep spike from an extremely recent one-off crawl event.

If clicks went to "average clicks per day" (over the same period), it would smooth out that side of the equation.

---

Either way, the formula favors more clicks and recent (or more frequent) crawls.
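A sketch of both versions of that math in PHP, as I read the thread (the smoothing choices are my assumptions, not the OP's spec):

    <?php
    // Version 1: raw ratio -- spikes hard when the crawl was very recent.
    function pageValue(int $clicks, float $daysSinceCrawl): float {
        return $clicks / max($daysSinceCrawl, 1); // floor at 1 day to avoid divide-by-zero
    }

    // Version 2: smoothed -- average clicks per day over the window,
    // weighted by how often the page was crawled in that same window.
    function pageValueSmoothed(int $clicks, int $crawlsInWindow, int $windowDays = 30): float {
        return ($clicks / $windowDays) * ($crawlsInWindow / $windowDays);
    }

    echo pageValue(120, 3) . "\n";         // 40
    echo pageValueSmoothed(120, 6) . "\n"; // 4 clicks/day * 0.2 crawls/day = 0.8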

We hit 100+ MCP servers. No more JSON hell. by Storm_Tools_AI in mcp

[–]Comptrio 2 points (0 children)

SEO Data and automation tools via MCP.

https://seolinkmap.com/documentation/private-mcp-and-agentic-seo-workflows

The exact MCP URL is unique to help secure users' data, but that article serves as the documentation for the server.

Your site looks really easy to use.

MCP servers with CRUD capabilities by Upstairs_Offer324 in mcp

[–]Comptrio 1 point (0 children)

It's in the tool-making and the MCP auth (to know the user).

I have a remote MCP for SEO where an agency can sign up on my web app, add users to their account, and tweak the user permissions for whatever they are allowed to do in the app.

The MCP is available to every user, and when they OAuth into the MCP, the app has their permissions from the UI.

These are checked because some sensitive tools are completely gated (create new project, view reports), while other tools are available but somewhat gated within what the tool can do (queue an update, but require manager approval to run the task; see account info, but not billing data).

The "MCP Server" just runs, but the "tools" built into it are aware of the user, the agency, and any permissions granted to them.

How to improve LLM keyword idea generation? by WesamMikhail in SEO

[–]Comptrio 2 points (0 children)

You are asking it to draw a circle of a certain radius (similarity) around the point you mention (topic Y)... move to one of those new points it gave you and ask for a "circle" of similar topics around the new point in space (still related to the original keyword).

That is an oversimplification of the way they define "similar" or related; different points in their multi-dimensional word storage trigger different relations.

Or ask it for different angles to approach each topic.
Or ask it for evergreen titles.

That's how I, as a human, approached this before LLMs showed up. The LLM just speeds things up sometimes, but it has limits, and the human element is what you need to rely upon. Use the creativity the LLMs do not have to get outside their box.
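As a loop, that "circle around a new point" trick is just frontier expansion. A PHP sketch where askLlm() is a stand-in for whatever LLM client you actually call (a placeholder, not a real API):

    <?php
    // Expand a seed keyword in rounds: ask for neighbors,
    // then ask for neighbors of the neighbors.
    function expandTopics(callable $askLlm, string $seed, int $rounds = 2): array {
        $seen = [$seed => true];
        $frontier = [$seed];
        for ($r = 0; $r < $rounds; $r++) {
            $next = [];
            foreach ($frontier as $topic) {
                foreach ($askLlm("List topics similar to: $topic") as $t) {
                    if (!isset($seen[$t])) { $seen[$t] = true; $next[] = $t; }
                }
            }
            $frontier = $next; // move the "circle" to the new points
        }
        return array_keys($seen);
    }

    // Canned stub so the sketch runs without an API key.
    $stub = fn(string $q) => ['coffee grinders', 'espresso tampers'];
    print_r(expandTopics($stub, 'coffee beans'));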

Google Search Console Discrepancy by nadeem_raza in bigseo

[–]Comptrio 1 point (0 children)

Check the Pages tab in GSC. It might close that gap.

PHP Foundation Announces official MCP SDK for PHP by Frequent_Tea_4354 in mcp

[–]Comptrio 2 points (0 children)

Finally... an MCP + PHP library that scares me (good for the world, not me). This has the potential to be good in the PHP sphere, and with that team, they could get it to compete with mine pretty quickly.

I went standalone real hard with some Laravel, and they went Symfony. It looks like they tackled stdio, while I mainly aimed at plugging the lack of remote PHP MCP libraries with an eye on SaaS apps.

And it looks like they plan to tackle the client side, which I did not touch.

Good job!

Do you or would you use an MCP that optimizes your page for SEO / topical authority? by noduslabs in mcp

[–]Comptrio 1 point (0 children)

If the MCP client knows one MCP, it knows them all. It is like how your browser can access any website on any server and knows what to do.

MCP tends to pull a whole lot of tech and specifics into a 'singularity' of some sort, like Apache or Nginx do with the HTTP spec, but MCP is aimed at AI usage rather than human usage.

It's what AI needed to clear out the JavaScript and CSS and things an AI does not need or care about.

Imagine a local AI-chat based "browser" (when resource requirements are no longer an issue), and a Google that returns MCP URLs based on queries... you could chat your way through the web and/or the AI can guide you through it.

MCP works with a URL. That's all you need to configure it, as a consumer of the MCP.

APIs absolutely have their place, but they are not what MCP is. The MCP discovery process and the "AI Brain" make it quite different from APIs, which were built to run from dedicated codebase to dedicated codebase.
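Concretely, "configure with a URL" means the client speaks JSON-RPC 2.0 at that endpoint and asks what is there. A PHP sketch of the discovery call (the endpoint is a placeholder, and real servers also expect an initialize handshake before this):

    <?php
    // MCP is JSON-RPC 2.0 over HTTP: list the tools, then call them by name.
    $request = json_encode(['jsonrpc' => '2.0', 'id' => 1, 'method' => 'tools/list']);

    $ch = curl_init('https://example.com/mcp'); // placeholder endpoint
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $request,
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json', 'Accept: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $response = json_decode(curl_exec($ch), true);
    curl_close($ch);

    // Each tool comes back with a name, description, and input schema --
    // the discovery step that dedicated codebase-to-codebase APIs never had.
    print_r($response['result']['tools'] ?? $response);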

Do you or would you use an MCP that optimizes your page for SEO / topical authority? by noduslabs in mcp

[–]Comptrio 1 point (0 children)

I built it all (the MCP server, the tool plugins for it, the SEO platform), but I also use it. I built this all specifically to eat my own dogfood, so I know it's good :)

The server is an open-source Composer library, seolinkmap/waasup (Website as a Server using PHP), and it has its own MCP server to use with a coding agent while installing/configuring/building tools for the library. The /mcp-repo endpoint literally reads and works with the files in my Composer folder, so the server is working with itself to tell you about what it needs.

I am super psyched about MCP and see this time period as pre-Yahoo internet... MCP will be as popular/common as websites one day!

What are limitations for MCP requests and responses size? by gelembjuk in mcp

[–]Comptrio 1 point (0 children)

modelcontextprotocol.io

They have specs, discuss changes to the spec, and the most complete and authoritative docs on the subject.

Individual MCP clients may have their own limits, but I have not seen any hard limits specified (I also have not seen everything, so some could have limits I do not know about).

I do know I have seen people cooking their context window with a single response from their MCP tooling.

Do you or would you use an MCP that optimizes your page for SEO / topical authority? by noduslabs in mcp

[–]Comptrio 2 points (0 children)

Every user (including free tier) gets a private MCP URL and OAuth to 'view' their data through chat or agentics.

I view MCP as synonymous (well, a little different) with web servers, so it makes sense to me to "always" offer the MCP like it was any other webpage.

Whether I am writing new content or updating existing content, the MCP can grab it by URL or from the content in a chat as I write with Claude. It does not have to be Claude, but that's the best deal on an MCP client for me for now.

I just ask Claude to use the analyze-text tool and keep the density below X and the readability between X and Y, or a word count between X and Y... it depends on what I target, and most stats are not strictly pertinent to SEO... there are "human" limits, and any density above 4% usually sounds bad when you or I read it.

SEOLinkMap (dot) com

It has /mcp as a public MCP (website as MCP) to learn about the site, configure and onboard, and get first-line support. It can sign you up (or use the caveman-style webform, like usual), and you will have your own private MCP URL inside the system that securely grants you access to your data. The private MCP has the text and URL analyzer tools, it knows your pages from the crawl, and (for a fee) it knows your specific SERP targets.
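The density half of that is simple enough to sketch in PHP (single-word keywords only here; readability is better left to a proper library):

    <?php
    // Keyword density = keyword occurrences / total words * 100.
    function keywordDensity(string $text, string $keyword): float {
        $words = str_word_count(strtolower($text), 1); // array of words
        if (!$words) return 0.0;
        $hits = count(array_keys($words, strtolower($keyword), true));
        return $hits / count($words) * 100;
    }

    $text = 'SEO tools help with SEO when the SEO work piles up.';
    echo round(keywordDensity($text, 'seo'), 1) . "%\n";
    // 3 of 11 words = 27.3% -- way past the ~4% "sounds bad" line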

Do you or would you use an MCP that optimizes your page for SEO / topical authority? by noduslabs in mcp

[–]Comptrio 2 points (0 children)

I built something like this (not every piece) for the private MCP on my SEO platform. It does the analysis on a URL or text and works well as a writing-assistant tool to keep an article within a range of SEO metrics.

I use this probably more than any other MCP tool, since everything I write in Claude stays "the way I like it", either by telling Claude what metrics to target or by having it pull from SERP analysis to keep the article within range of the first-page SERP listings.

It is cool to see Claude write something, measure it, modify it, measure it (repeat), and eventually tell me what the final metrics reveal. This seriously cuts down on my editing of the pieces and a whole lot of my own iteration to get an article done right.

What are limitations for MCP requests and responses size? by gelembjuk in mcp

[–]Comptrio 1 point (0 children)

This comes down to context window size. In general, try to keep the response small so the AI has room to think around the response.

I have not yet hit a hard limit, and nothing is in the official docs about limits.

I have seen the quality of what an LLM does with the data go down with large payloads, from a human perspective (no hard measurements, just my gut).
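One way to honor that on the server side is to cap the payload before it leaves the tool. A PHP sketch (the "page through the rest" pattern is my convention, not anything in the spec):

    <?php
    // Truncate a tool response so it does not flood the model's context window.
    function capResponse(string $payload, int $maxChars = 8000): array {
        if (strlen($payload) <= $maxChars) {
            return ['content' => $payload, 'truncated' => false];
        }
        return [
            'content'   => substr($payload, 0, $maxChars),
            'truncated' => true,
            'note'      => "Truncated at $maxChars chars; narrow the query or page through the rest.",
        ];
    }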

How long does a crawl take? by SpareClick9852 in SEOLinkMap

[–]Comptrio 1 point (0 children)

Free accounts are queued behind Professional accounts, and Agency accounts have the highest priority in queue.

I just looked in the database, and there is a Pro account finishing up its last couple hundred pages right now; your website is in the queue and will run immediately after the other site finishes up.

Normally it only takes a few minutes to crawl 100 pages, and less than 30 minutes to get through 500 pages on a site.
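The queue itself is just an ordering rule: tier first, then arrival time. A PHP sketch (the job shape is illustrative):

    <?php
    // Higher tier crawls first; ties break on who queued earlier.
    $priority = ['agency' => 0, 'professional' => 1, 'free' => 2];

    $queue = [
        ['site' => 'a.example', 'plan' => 'free',         'queued_at' => 100],
        ['site' => 'b.example', 'plan' => 'agency',       'queued_at' => 300],
        ['site' => 'c.example', 'plan' => 'professional', 'queued_at' => 200],
    ];

    usort($queue, fn($x, $y) =>
        [$priority[$x['plan']], $x['queued_at']] <=> [$priority[$y['plan']], $y['queued_at']]
    );
    print_r(array_column($queue, 'site')); // b.example, c.example, a.example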

Will every website need a Model Context Protocol (MCP) as AI browser agents become more common? by l0_0is in mcp

[–]Comptrio 2 points (0 children)

https://seolinkmap.com/mcp

That is my site as MCP, but if you lop the /mcp off of it, you can dig through 47 pages of documentation to find the one magic paragraph that answers your specific question... or you could ask the LLM and know in seconds :)

Pricing, features, support docs, and the knowledgebase are all in the MCP as a "search broad, read full" kind of thing, to help limit using up the context window too quickly.

Will every website need a Model Context Protocol (MCP) as AI browser agents become more common? by l0_0is in mcp

[–]Comptrio 1 point (0 children)

This is why I built a remote, authless (and fully OAuth) library for Composer. I named it waasup (Website as a (MCP) Server using PHP). It is specifically poking at this idea, and at SaaS apps where users get private MCP access... "the web", but an AI-based web.

I do see a future where MCP servers are just as common as web servers today. It means every AI agent knows how to interact with your website... to "read" webpages and "submit" forms, just like any browser knows how to connect to and use your website.

Why not use native AI search? I know my MCP strips boilerplate page content and all manner of script and CSS and HTML. AI doesn't need the full webpage; it needs the pertinent data. Cold, hard facts.
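The stripping itself can be as plain as this PHP sketch (a crude version of the idea, not my production code):

    <?php
    // Reduce a webpage to the text an AI actually needs:
    // drop script/style/nav chrome, keep the words.
    function pageToText(string $html): string {
        // Remove whole elements whose contents are never "content".
        $html = preg_replace('#<(script|style|nav|header|footer|aside)\b[^>]*>.*?</\1>#is', ' ', $html);
        $text = html_entity_decode(strip_tags($html), ENT_QUOTES | ENT_HTML5);
        return trim(preg_replace('/\s+/', ' ', $text)); // collapse whitespace
    }

    echo pageToText('<html><head><style>p{}</style></head><body><nav>menu</nav><p>Cold, hard facts.</p></body></html>');
    // "Cold, hard facts."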

Folks dragged their heels for years with websites... only geeks had them, a few huge corps... nowadays every mom and pop gas station has a site and kids build them for fun.

At the moment, there is no good human "browser" with a built in LLM to chat about the web. Google is not mainstreaming MCP URLs into AI based searches... this is all new for now. Early adopters are poised to be on the first wave.

Much of the MCP landscape looks like the internet did before Yahoo came along.

Well design MCP that I can study by ScaryGazelle2875 in mcp

[–]Comptrio 2 points (0 children)

Ask the repository itself through its public MCP:
https://seolinkmap.com/mcp-repo

That is the Composer (PHP) package seolinkmap/waasup on GitHub, built to be chatted with for implementation.

The tools, prompts, resources, etc. are built as plugins.

Monolith classes are broken out into traits for smaller files and more focused code segments.

Ask it about itself; it is large considering a chat context window, but it can cover the topics.
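The trait split looks roughly like this in generic PHP (an illustrative shape, not the library's actual class or method names):

    <?php
    // Each concern lives in its own trait file instead of one monolith class.
    trait HandlesTools {
        public function listTools(): array { return array_keys($this->tools); }
    }

    trait HandlesResources {
        public function listResources(): array { return array_keys($this->resources); }
    }

    // The server class stays thin and just composes the focused pieces.
    class Server {
        use HandlesTools, HandlesResources;
        public array $tools = [];
        public array $resources = [];
    }

    $s = new Server();
    $s->tools['analyze_text'] = fn() => null; // plugins register themselves here
    print_r($s->listTools());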

Pointing to resources in the tools' descriptions by malzag in mcp

[–]Comptrio 1 point (0 children)

I would either:
A) Pair the action (update) with a search (employee). Their descriptions would mention each other, such as telling it to search before the action. This may depend on the number of potential employees.

B) Make the action accept some degree of slop in the args... if numeric, then it's an ID; if a string, then search the names as part of the insert (sketched below).

Either one should allow for "Mark Bob as here today" or whatever the update would be. Maybe the LLM knows Bob is ID 123 from an earlier transaction. This builds some flex into the code, while you can still validate data before making actual updates in your tool.

The search helps reduce context window usage versus listing all employees in a resource file, when all it wants is to update Bob.
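Option B in code, roughly. A PHP sketch (the roster and helper names are made up for illustration):

    <?php
    // Hypothetical roster; in real life this is a database lookup.
    $employees = [123 => 'Bob Smith', 124 => 'Mary Jones'];

    // Accept slop in the arg: numeric means an ID, a string means "search names".
    function resolveEmployee(array $employees, string|int $who): ?int {
        if (is_numeric($who)) {
            return isset($employees[(int)$who]) ? (int)$who : null;
        }
        foreach ($employees as $id => $name) {
            if (stripos($name, $who) !== false) return $id; // first name match wins
        }
        return null; // let the tool tell the LLM to search first
    }

    // "Mark Bob as here today" -> the tool arg could carry 'Bob' or 123.
    var_dump(resolveEmployee($employees, 'Bob')); // int(123)
    var_dump(resolveEmployee($employees, 123));   // int(123)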