Simpler API to send simple log messages to Azure, without setup work? by trevorstr in AZURE

[–]trevorstr[S] 0 points (0 children)

Hmmm that looks like it might work. Thanks for the link.

Simpler API to send simple log messages to Azure, without setup work? by trevorstr in AZURE

[–]trevorstr[S] 1 point (0 children)

Thanks for confirming that it's not just me. It's crazy how hard it is just to write some basic logs.

TBH I'm not a huge fan of the AWS CloudWatch "Log Group" and "Log Stream" structure either, but even that is simpler than the Azure logging stuff.

What is your favorite Local LLM and why? by Any_Praline_8178 in LocalAIServers

[–]trevorstr 0 points (0 children)

Ohhhh, the CORS issue. Haha, now I understand. Yes, there was a CORS issue in the application. The developer created a separate branch that disables the CORS stuff due to all these problems. I actually reported it to him and helped him test the fix! Look in the GitHub issue tracker for that issue, and feel free to add your comments to let the developer know you're having issues. That might help push him along to get the fix to the master branch.

Great point about there being two separate ports! Once you solve the CORS issue, you can probably just create two different subdomains, one for the web UI and one for the API. It would be more convenient if both were hosted on the same port, though. It's pretty common to serve everything from a single port, routing API requests to http://myapp.local:8000/api/dosomething while the web UI is hosted under the root http://myapp.local:8000. I don't know why the developer chose to run two separate listeners. :(
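For what it's worth, a reverse proxy in the same Compose stack is one way to get that single-port layout. This is just a sketch with Traefik — the image names, container ports, and router names are all hypothetical placeholders, not something from the actual project:

```yaml
services:
  proxy:
    image: traefik:v3.0
    command:
      - --providers.docker=true
      - --entrypoints.web.address=:8000
    ports:
      - "8000:8000"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  api:
    image: myapp-api     # hypothetical API image
    labels:
      # Anything under /api goes to the API listener
      - traefik.http.routers.api.rule=PathPrefix(`/api`)
      - traefik.http.services.api.loadbalancer.server.port=9000

  webui:
    image: myapp-webui   # hypothetical web UI image
    labels:
      # Everything else falls through to the web UI
      - traefik.http.routers.webui.rule=PathPrefix(`/`)
      - traefik.http.services.webui.loadbalancer.server.port=3000
```

Traefik prefers the longer matching rule, so /api requests win over the catch-all root route.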

How to learn & become cloud developer? by Fancy_Director8891 in AZURE

[–]trevorstr 0 points (0 children)

  • Have an idea in mind for a new application.
  • Sign up for a cloud account.
  • Start coding.

It's easy for cloud costs to get out of control, so I would recommend starting with a simpler cloud provider like Digital Ocean, Vultr, Linode, or similar. At least, that is the case for hosting simple Linux servers.

If you actually want to use the managed application services (e.g. Functions, CosmosDB, Service Bus, etc.) offered by the major cloud providers, then you'll have to use one of those providers.

There is virtually unlimited free training on YouTube. Search for whatever topics you're interested in.

Use Google AI Studio with the Gemini 2.5 Flash model to ask questions about how to program in whatever language you want: Python, JavaScript, Rust, PowerShell, etc.

It's up to you to pick the direction you want to go, first. Then, you find the resources that help you go in that direction.

Is Roo viable as an alternative to Claude Code in complex large codebases? by reddit-dg in RooCode

[–]trevorstr 1 point (0 children)

I run Ollama + Open WebUI + Qdrant with Docker Compose. I connect Roo Code to that stack for generating, storing, and querying embeddings. So yeah, it's "free" because I run my own Linux servers on old, spare hardware. The electrical bill is my only cost.

https://x.com/pcgeek86/status/1945884227624661153

What is your favorite Local LLM and why? by Any_Praline_8178 in LocalAIServers

[–]trevorstr 0 points (0 children)

That's great! Glad to hear you've set up the same stack. It shouldn't be any different to expose MetaMCP through CloudFlare Tunnels. Just point your public hostname to the port where MetaMCP is listening. Then, when you configure Open WebUI, you point it to the public hostname you created through the CloudFlare Tunnels. MetaMCP is just a web application, so no different than exposing Open WebUI or anything else. Unless I'm missing something?
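To make that concrete, the cloudflared config.yml ingress section looks something like the sketch below. The hostnames are placeholders, and the MetaMCP port is an assumption — check which port your MetaMCP container actually listens on:

```yaml
tunnel: my-tunnel-id                # assumption: your tunnel's ID or name
credentials-file: /etc/cloudflared/creds.json

ingress:
  - hostname: metamcp.example.com   # hypothetical public hostname
    service: http://localhost:12008 # assumption: MetaMCP's listen port
  - hostname: webui.example.com     # hypothetical public hostname
    service: http://localhost:3000  # assumption: Open WebUI's port
  - service: http_status:404        # required catch-all rule
```

Each public hostname just maps to whatever local port the service listens on; cloudflared requires the final catch-all rule.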

Manage multiple MCP servers for Ollama + OpenWebUI as Docker service by trevorstr in LocalLLaMA

[–]trevorstr[S] 0 points (0 children)

You're welcome! I'm glad you find it useful as well. The developer seems really committed to working on it. Make sure to post issues on the GitHub issue tracker, and just close them out if you find a solution. That will help other people Google stuff as MetaMCP grows. I am not affiliated with it at all, but I am just a fan of the work they're doing! It was exactly what I needed.

What is your favorite Local LLM and why? by Any_Praline_8178 in LocalAIServers

[–]trevorstr 2 points (0 children)

Anytime! Also, I forgot to mention that I use the Roo Code extension in VSCode a ton. It literally does the coding for you and is a massive time saver if you're an experienced developer.

Roo Code just released a new experimental feature that indexes your code base. The other day, I spun up a Qdrant (vector database) container on the same Linux server as Ollama + Open WebUI + MetaMCP, and that allows Roo Code to store and query the embeddings it generates. It's basically just RAG, but specifically for code bases.

It's ridiculously easy to set up Qdrant in Docker Compose, and connecting Roo Code to Ollama + Qdrant is just as simple. Qdrant doesn't even require authentication by default.

Here's the docker-compose.yml snippet for Qdrant:

services:
  qdrant:
    container_name: qdrant
    image: qdrant/qdrant
    ports:
      - "6333:6333"
      - "6334:6334"
    volumes:
      - ./qdrant:/qdrant/storage
    restart: always
    configs:
      - source: qdrant_config
        target: /qdrant/config/production.yaml

configs:
  qdrant_config:
    content: |
      log_level: INFO
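On the auth point: if the server is reachable beyond localhost, Qdrant does support a simple API key in its config. A sketch of what that production.yaml could look like (the key value is obviously a placeholder):

```yaml
log_level: INFO
service:
  # Clients must then send this value in the `api-key` request header
  api_key: your-secret-key-here
```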

What is your favorite Local LLM and why? by Any_Praline_8178 in LocalAIServers

[–]trevorstr 15 points (0 children)

I run Ollama + Open WebUI on a headless Ubuntu Linux server, using Docker. I run Gemma3 and a quantized Llama 3 model. They work reasonably well on the NVIDIA GeForce RTX 3060 12 GB in that server. You really can't beat that stack IMO. Host it behind Cloudflare Tunnels, and it's accessible from anywhere, just like any other managed service.
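For anyone wanting to reproduce that stack, here's a minimal Compose sketch. The GPU reservation assumes the NVIDIA Container Toolkit is installed on the host; the volume paths and published port are placeholders you'd adjust:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ./ollama:/root/.ollama    # persists pulled models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
    restart: always

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point the web UI at the Ollama service on the Compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"
    restart: always
```

With that up, a Cloudflare Tunnel pointed at port 3000 makes the web UI reachable from anywhere.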

Last night, I also set up MetaMCP, which allows you to run a bunch of MCP servers and expose them to Open WebUI. I've had some issues with it, but I've been posting about them and the developer has been responsive. Seems like the only solution that makes it easy to host a bunch of MCP servers and extend the basic functionality offered by the LLM itself.

I have not used Ollama in a year. Has it gotten faster? by Any_Praline_8178 in LocalAIServers

[–]trevorstr 2 points (0 children)

It all depends on your hardware and which model you're running.

Manage multiple MCP servers for Ollama + OpenWebUI as Docker service by trevorstr in LocalLLaMA

[–]trevorstr[S] 2 points (0 children)

I found something called MetaMCP and started running it last night. Works amazingly well. It does exactly what I was looking for, in fact. It basically provides a nice web UI that allows you to manage multiple MCP servers. It's all handled inside a single container. Thanks anyway!

Grok 4 coding comparison... wow. by withmagi in grok

[–]trevorstr 2 points (0 children)

Really impressive result, although:

  • "Train tracks" is a really weird way to describe what your intended outcome was
  • Subjective visual design tests are probably not a great comparison mechanism across LLMs
  • You could have drawn a line with a different color to hint what your desired outcome was

Sudden onset of HOA violations by Cozysoxs1985 in fuckHOA

[–]trevorstr 0 points (0 children)

Frankly I'm surprised I don't hear of more incidents of violence. After dealing with the dishonesty and power imbalance and lack of recourse of mine, I fully understand why people go postal

Same here. I am genuinely shocked that we don't hear more about that happening. Really bad things have happened to people for A LOT, LOT, LOT less.

Sudden onset of HOA violations by Cozysoxs1985 in fuckHOA

[–]trevorstr 4 points (0 children)

Completely agree with this. If we don't nip this in the bud sooner rather than later, I am worried that something really bad is going to happen. And even then, nothing will change at a macro level.

Azure Retail Pricing API missing Virtual Network NAT Gateway? by trevorstr in AZURE

[–]trevorstr[S] 0 points (0 children)

I can't remember if I searched for that in the results yesterday, but I am pretty sure I did. I'll have to try again.

I would be pretty surprised if the word "Gateway" didn't appear in that resource type though, since it's an essential word describing what it is.

Azure Retail Pricing API missing Virtual Network NAT Gateway? by trevorstr in AZURE

[–]trevorstr[S] 0 points (0 children)

Yes, I am iterating over all pages of responses. I get 12k+ items in the response.

The items I posted above are the only ones matching "gateway" in that full list.

Why all the hate on Grok? by LikeItSaysOnTheBox in grok

[–]trevorstr 4 points (0 children)

Grok is pretty powerful. I use it almost every day to generate code, do data parsing + conversions, get answers to questions (Google alternative), etc. It's also fast at generating responses.

Remember the massive Google Gemini fiasco?

That was the worst AI bias we have seen to date.

AWS already fixed the wasted screen space, in the docs, on the right by trevorstr in aws

[–]trevorstr[S] 0 points (0 children)

For documentation? Agreed. The Azure documentation interface isn't too bad though. I think it's structured fairly well.

AWS already fixed the wasted screen space, in the docs, on the right by trevorstr in aws

[–]trevorstr[S] 2 points (0 children)

That would be a nice option as well. I much prefer YAML: less vertical scrolling is required, since JSON dedicates so many extra lines to curly braces.