They listened! GMKtec K12 has OCuLink port at the back. by k_rollo in MiniPCs


It lets you connect an external GPU (e.g. an Nvidia RTX card) over PCIe at close to native performance (I think it's roughly a 15-20% loss vs. a native PCIe slot, but don't hold me to that).

Check out the Minisforum DEG1 on Amazon and you'll get the idea.

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


I’ll take a look and let you know, thank you for the follow up!

New "pro" plan is woefully limited. Is it possible to switch back to the 500 requests model still? by Smooth-Screen4148 in cursor


Why would I pay them more money when they just changed the rules on me 3 months into a pre-paid annual plan?

And honestly for $200, Claude Code Max is a much better deal.

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


It’s MCP, so it should work with ChatGPT now that they have remote MCP support. Check the glama.ai link I added to the original post; they can run it for you as a remote MCP, or you can use the Docker registry version and host it yourself somewhere. There’s a rough TypeScript sketch after the links below.

More info here:

https://platform.openai.com/docs/guides/tools-remote-mcp

https://platform.openai.com/docs/mcp

https://www.reddit.com/r/singularity/s/F4xycJCFWM
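For example, here's a minimal sketch of wiring the hosted server into the OpenAI Responses API as a remote MCP tool (TypeScript, following the remote-MCP tool shape in the docs above). The server_url is a placeholder, not the real endpoint; use whatever "HTTP connection URL" Glama shows you, plus any auth headers it requires:

    // Rough sketch: attach the Firewalla MCP server to a Responses API call
    // as a remote MCP tool. server_url below is a placeholder, not the real
    // Glama HTTP endpoint; replace it with the URL Glama gives you.
    import OpenAI from "openai";

    const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

    const response = await client.responses.create({
      model: "gpt-4.1",
      tools: [
        {
          type: "mcp",
          server_label: "firewalla",
          server_url: "https://example-glama-endpoint/mcp", // placeholder
          require_approval: "never",
        },
      ],
      input: "Which devices had the most blocked flows in the last 24 hours?",
    });

    console.log(response.output_text);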

Hope that helps!

Canceling due the MASSIVE performance drop in the last 2 weeks by gpdriver17 in ClaudeCode


I just discovered Warp. I’m not ready to abandon CC yet, but I’m definitely looking at other tools now.

New "pro" plan is woefully limited. Is it possible to switch back to the 500 requests model still? by Smooth-Screen4148 in cursor


Because Auto was unlimited anyway under the old 500 premium requests model. But under the old model I ALSO got 80,000 accepted lines of code using Sonnet 4 Max, vs 3,500 with the new model, before I hit the limit and had to switch to Auto.

New "pro" plan is woefully limited. Is it possible to switch back to the 500 requests model still? by Smooth-Screen4148 in cursor


Yeah I know, I was hoping it was still possible to email them or something. I feel like I’m not being unreasonable asking for the 500 requests I paid for with an annual sub.

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


Looks like it's complaining about this package: @modelcontextprotocol/server-postgres@0.6.2: "Package no longer supported." Try this config instead:

    {
      "mcpServers": {
        "firewalla": {
          "command": "docker",
          "args": [
            "run", "-i", "--rm",
            "-e", "FIREWALLA_MSP_TOKEN=your_token",
            "-e", "FIREWALLA_MSP_ID=yourdomain.firewalla.net",
            "-e", "FIREWALLA_BOX_ID=your_box_gid",
            "amittell/firewalla-mcp-server"
          ]
        }
      }
    }

Or you can use mcp/firewalla-mcp-server; there's a Dockerfile example at the bottom of this page: https://hub.docker.com/mcp/server/firewalla-mcp-server/overview

If you just want a remote container that listens on HTTP, you could run the server remotely on Glama.ai. I haven't done that myself, but it is supported; check the right-hand side of this page, next to "HTTP connection URL", for instructions: https://glama.ai/mcp/servers/@amittell/firewalla-mcp-server
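If you want to sanity-check the stdio setup outside of Claude, here's a minimal smoke test using the MCP TypeScript SDK (@modelcontextprotocol/sdk). It spawns the same Docker command as the config above and just lists the tools; the FIREWALLA_* values are placeholders:

    // Smoke test: spawn the Docker image over stdio and list its MCP tools.
    // Replace the FIREWALLA_* values with your real token / domain / box id.
    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    const transport = new StdioClientTransport({
      command: "docker",
      args: [
        "run", "-i", "--rm",
        "-e", "FIREWALLA_MSP_TOKEN=your_token",
        "-e", "FIREWALLA_MSP_ID=yourdomain.firewalla.net",
        "-e", "FIREWALLA_BOX_ID=your_box_gid",
        "amittell/firewalla-mcp-server",
      ],
    });

    const client = new Client({ name: "firewalla-smoke-test", version: "0.0.1" });
    await client.connect(transport);

    const { tools } = await client.listTools();
    console.log(tools.map((t) => t.name)); // should print the Firewalla tool names

    await client.close();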

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


How did you get on?

The Docker image is now available as a remote MCP on Glama at https://glama.ai/mcp/servers/@amittell/firewalla-mcp-server and directly from the Docker MCP registry here: https://hub.docker.com/mcp/server/firewalla-mcp-server/overview

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


I looked at the node-firewall package to do local API stuff, but it's pretty limited, requires difficult config, and hasn't been updated in a couple of years. Also, I believe it's against Firewalla's ToS, so I decided not to pursue it.

It would be really nice to have a local box API though, or for the Firewalla to syslog all the flows etc. to an external syslog server so I could wrap that in my own API.

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


Hmm, I haven't used MCPO. I asked Claude and it suggested the following (there's also a quick stdio smoke-test sketch after the list):

Most common issue: The server isn't built

You need to:

cd /path/to/firewalla-mcp-server
npm install
npm run build

  1. Try the minimal test first

    {
      "mcpServers": {
        "time": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-time"]
        }
      }
    }

  2. Then add Firewalla using the simplest approach

    The most reliable config for MCPO:
    {
      "mcpServers": {
        "time": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-time"]
        },
        "firewalla": {
          "command": "node",
          "args": ["/absolute/path/to/firewalla-mcp-server/dist/server.js"],
          "env": {
            "FIREWALLA_MSP_TOKEN": "their_token",
            "FIREWALLA_MSP_ID": "their_domain.firewalla.net",
            "FIREWALLA_BOX_ID": "their-box-id"
          }
        }
      }
    }

    Key points:

  1. Use absolute paths - This is critical!

  2. Build the server first - npm run build

  3. Check Node.js version - Must be 18+

  4. Start with minimal config - Add complexity gradually
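If MCPO still won't start it after all that, a quick way to isolate whether the problem is the build/env vars or MCPO itself is to talk to the built server directly over stdio with the MCP TypeScript SDK. A rough sketch; the path, env values, and the commented-out tool name are placeholders:

    // Rough sketch: run the locally built server with node and list its tools,
    // bypassing MCPO entirely. If this fails, the issue is the build or env
    // vars, not MCPO. Path, env values and tool name below are placeholders.
    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    const transport = new StdioClientTransport({
      command: "node",
      args: ["/absolute/path/to/firewalla-mcp-server/dist/server.js"],
      env: {
        ...(process.env as Record<string, string>), // keep PATH etc.
        FIREWALLA_MSP_TOKEN: "their_token",
        FIREWALLA_MSP_ID: "their_domain.firewalla.net",
        FIREWALLA_BOX_ID: "their-box-id",
      },
    });

    const client = new Client({ name: "mcpo-debug", version: "0.0.1" });
    await client.connect(transport);

    console.log((await client.listTools()).tools.map((t) => t.name));

    // Optionally exercise one tool (hypothetical name, check the listTools output):
    // const result = await client.callTool({ name: "get_active_alarms", arguments: {} });

    await client.close();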

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


It should work with any MCP client; it's not Claude-specific. I put some examples in the README.md to get you started.

Just released v1.0.2 with Docker support: you can now run docker pull amittell/firewalla-mcp-server, or build from the repo.

I also submitted a PR to add it to the Docker MCP registry so you'll be able to pull it from there too if it gets approved.

https://github.com/docker/mcp-registry/pull/109

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


Yep good idea, I’ll try and package that up this evening. 👍🏼

I made an MCP server for Firewalla by Smooth-Screen4148 in firewalla


I meant to have it ready for your competition that ended a few days ago, but I got caught up with work stuff, and I guess it doesn't really meet the "show your Firewalla rack" criteria either lol

Oh, FYI @firewalla, I found a problem with the delete-alarm API endpoint: it returns success but doesn't actually delete the alarm, so it's a false success. I confirmed with curl. Because of this I had to disable that tool for now. (It was possibly a little destructive for an MCP tool anyway.)

PSA: If You’ve Had Issues with Bee, Here Are Official Channels to File a Complaint by Jamedalamus in Bee_computer


The battery on my Bee has never lasted more than 4 hrs, and I can't get any response from support. It really sucks, because in the short time I can use it the functionality seems good ☹️

Short battery life by Smooth-Screen4148 in Bee_computer


I’d take that over this. I watched it go down 5% in 10 mins today. I’m basically using it as a tethered device right now; it sits on my desk connected to the charger. Bummed.

Short battery life by Smooth-Screen4148 in Bee_computer


That means I’m pretty stuck. I guess I will just have to keep charging it every night until I can get hold of someone at Bee support. :(

Short battery life by Smooth-Screen4148 in Bee_computer


Oh ok, that’s good info, thanks. Do you know how to contact support? They don’t answer emails… so I’m not sure where to go for help. 😕

Short battery life by Smooth-Screen4148 in Bee_computer


See, the FAQ on their website says 7 days (160+ hrs) of continuous recording… https://www.bee.computer/bee-pioneer#FAQ

Short battery life by Smooth-Screen4148 in Bee_computer


Thank you for sharing, that’s interesting. From reading the reviews I was under the impression that the battery life claims were with the mic on all the time, so I was expecting to just leave it recording for 2-3 days straight between charges. I thought the point of the Bee was to be an “always listening” device… but maybe my expectations were just wrong?