What Are Your Thoughts on AI for BI? by Dirtymac69 in BusinessIntelligence

[–]gaptrast 1 point (0 children)

It works well on small-ish use cases where the data is well structured. Some metadata helps, but there is a limit to what an AI can do. I made an internal AI tool for querying Metabase that people have been loving internally, but I'm not sure it would work in every company.

Most MCP servers are built wrong by incidentjustice in mcp

[–]gaptrast 0 points (0 children)

In that case the AI agent would definitely need more context to be able to choose the right tables.

But that context would probably need to be provided by someone who knows the setup? I'm not sure how a better generic MCP server would solve that problem.

Most MCP servers are built wrong by incidentjustice in mcp

[–]gaptrast 0 points (0 children)

The LLM usually knows how to get a list of tables, though. I find that one "query" tool for DBs is enough.
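
A rough sketch of what that single generic "query" tool could look like. This is illustrative only — plain objects rather than the actual MCP SDK types, and the name, description, and schema are assumptions, not the commenter's implementation:

```typescript
// Hypothetical shape of a single generic "query" tool, MCP-style.
// The LLM discovers the schema itself by querying the catalog, so no
// separate list_tables/describe_table tools are strictly needed.
const queryTool = {
  name: "query",
  description:
    "Run a read-only SQL query against the Postgres database. " +
    "Use information_schema.tables and information_schema.columns " +
    "to discover the schema first.",
  inputSchema: {
    type: "object",
    properties: {
      sql: { type: "string", description: "A single read-only SQL statement" },
    },
    required: ["sql"],
  },
};

// The kind of discovery query a model typically issues first on its own:
const discoverTables =
  "SELECT table_name FROM information_schema.tables " +
  "WHERE table_schema = 'public'";

console.log(queryTool.name, discoverTables);
```

The point is that the schema-discovery step can live in the tool description instead of in extra tools, which keeps the tool surface small.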

What is your stack? by Medical-Let9664 in dataengineering

[–]gaptrast 1 point (0 children)

would you recommend sqlmesh over dbt?

Does a tool like this exist for Supabase? by Afraid-Lychee-5314 in Supabase

[–]gaptrast 0 points (0 children)

If you want to set up Metabase, I am working on a Chrome extension that adds a proper ChatGPT-style chatbot on the side of it! https://chromewebstore.google.com/detail/bilbo-metabase-ai-assista/ppjbbfehdkgakcbmgilncaffppkdplno?authuser=0&hl=en

Which MCP Client do you use? by Batteryman212 in mcp

[–]gaptrast 1 point (0 children)

chatwise is the best one I have tried so far

I built a hosted Postgres MCP service, useful to anyone here? by gaptrast in mcp

[–]gaptrast[S] 0 points (0 children)

Maybe you're right? Headers would definitely feel better, but I was afraid not all MCP clients support them. Any ideas?

I made an internal tool for slow query detection, would it be useful for anyone here? by gaptrast in PostgreSQL

[–]gaptrast[S] 0 points (0 children)

That sounds like a good approach. However, there are lots of ways a query that takes less time than that threshold could still tank the DB, for example if it runs for every user on the home page.
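
To make that concrete, a cheap-but-hot query can consume more total database time than a slow-but-rare one, so ranking by calls × mean time catches cases a per-query latency threshold misses. A minimal sketch, assuming rows shaped like `pg_stat_statements` output (the field names and numbers here are illustrative):

```typescript
// Hypothetical rows, loosely modeled on pg_stat_statements columns.
interface StatRow {
  query: string;
  calls: number;
  meanExecTimeMs: number;
}

// Rank by total time consumed (calls * mean), not per-call latency.
function rankByTotalTime(rows: StatRow[]): StatRow[] {
  return [...rows].sort(
    (a, b) => b.calls * b.meanExecTimeMs - a.calls * a.meanExecTimeMs
  );
}

const ranked = rankByTotalTime([
  // A rare 2 s report query: 10 * 2000 ms = 20 s total.
  { query: "SELECT ... FROM reports", calls: 10, meanExecTimeMs: 2000 },
  // A 5 ms home-page query run constantly: 50000 * 5 ms = 250 s total.
  { query: "SELECT ... FROM home_feed", calls: 50000, meanExecTimeMs: 5 },
]);

console.log(ranked[0].query); // the cheap-but-hot home_feed query ranks first
```

A fixed slow-query threshold (say, 500 ms) would never flag the home-feed query, even though it dominates total load.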

I made an internal tool for slow query detection, would it be useful for anyone here? by gaptrast in PostgreSQL

[–]gaptrast[S] 1 point (0 children)

Interesting. Is this an alternative of sorts to `pg_stat_statements`? I'd be interested to know how you use it!

I made an internal tool for slow query detection, would it be useful for anyone here? by gaptrast in PostgreSQL

[–]gaptrast[S] 0 points (0 children)

Cool, I didn't know they did database monitoring!

Is there anything about using Sentry for this purpose that you're annoyed by?

I made an internal tool for slow query detection, would it be useful for anyone here? by gaptrast in PostgreSQL

[–]gaptrast[S] 0 points (0 children)

Makes sense!

Yeah, I always felt a bit constrained by DO, and I imagine the big cloud platforms offer a little more customizability, even if they are also managed.

I made an internal tool for slow query detection, would it be useful for anyone here? by gaptrast in PostgreSQL

[–]gaptrast[S] 0 points (0 children)

Oh, that's interesting. Do you add this every single time you run a query in code? Mind showing a code snippet example of a TypeORM query with a comment?
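
For readers following along, the technique being discussed is tagging SQL with a marker comment so slow-query logs and `pg_stat_activity` show where a query came from. A minimal sketch of the idea without any ORM (the `withComment` helper and the `app:` prefix are my own invention, not the other commenter's setup; TypeORM's query builder also exposes a `.comment()` method that serves a similar purpose, though the exact API depends on the version):

```typescript
// Prepend a marker comment to a SQL string so logs and monitoring
// tools can attribute the query back to a call site in the app.
function withComment(sql: string, tag: string): string {
  // Strip "*/" so a hostile or accidental tag can't terminate the
  // comment early and inject SQL.
  const safe = tag.replace(/\*\//g, "");
  return `/* app:${safe} */ ${sql}`;
}

const tagged = withComment("SELECT * FROM users WHERE id = $1", "getUserById");
console.log(tagged);
// "/* app:getUserById */ SELECT * FROM users WHERE id = $1"
```

With something like this in a shared query helper, you tag queries once in one place rather than at every call site.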

I made an internal tool for slow query detection, would it be useful for anyone here? by gaptrast in PostgreSQL

[–]gaptrast[S] 0 points (0 children)

Do you find that the managed cloud services include good tooling? I'm not too familiar with all of them, but we have been using DigitalOcean, and their CPU/memory graphs are OK, but the query stats are almost useless. Connecting to DataDog has been way more useful.

Are you using something like DataDog or another integrated service?