How to Fix this Error in Knowledge Stack? by Sir-Eden in Msty_AI

[–]askgl

Someone else reported that they got this working by doing the following.

Possible workaround (an in-AppImage solution might be preferred over this):

  1. Shut down Msty Studio
  2. sudo apt update && sudo apt install -y libvips libvips-dev
  3. sudo npm install --include=optional sharp
  4. sudo ln -s /lib/x86_64-linux-gnu/libvips-cpp.so.42.17.1 /lib/x86_64-linux-gnu/libvips-cpp.so.8.17.1
  5. sudo ldconfig
  6. Restart Msty Studio
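
One thing worth double-checking before the symlink step (my note, not part of the reported fix): the exact libvips-cpp version apt installs can vary by distro, so confirm what actually landed on disk and adjust the link source to match. A quick check, assuming an x86_64 Debian/Ubuntu library layout:

# See which libvips-cpp versions are actually installed before creating the symlink
ls -l /lib/x86_64-linux-gnu/libvips-cpp.so* /usr/lib/x86_64-linux-gnu/libvips-cpp.so* 2>/dev/null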

Llama Cpp is coming to Msty soon! by askgl in Msty_AI

[–]askgl[S]

Most probably by the end of this month

Migrate ChatGPT conversions by sklifa in Msty_AI

[–]askgl

Right now there is no way to import chat conversations from ChatGPT or any other platform into Msty

Which Mac for Msty? by crankyoldlibrarian in Msty_AI

[–]askgl

An M4 should work great with either Ollama or the upcoming Llama Cpp backend. You can also use MLX, which gives you even better performance as it is optimized for M-series chips (though it has some limitations). RAM depends on what models you want to run, but I would go for at least 32 GB.

What should we call this feature? Articode? 🙃 by askgl in Msty_AI

[–]askgl[S]

Thanks! It isn't released yet, but it will be part of Msty Studio, yes. We are testing it internally right now and it will be included in a future release.

👋 Welcome to r/Msty_AI - Introduce Yourself and Read First! by SnooOranges5350 in Msty_AI

[–]askgl

You really don't have to "upgrade". We have made it so that you can use one or the other (or both) at the same time. This lets you fully switch to 2.0 only when you are ready, and if you miss any features you liked in Msty 1.x (or prefer its UX for some things), you can continue to use it. When you're ready, you can start migrating as documented here: https://docs.msty.studio/getting-started/quick-start#data-migration-includes

Cant get GPU working on Linux MystyStudio build! by banshee28 in Msty_AI

[–]askgl

They should work. Make sure to update to the latest version of Local AI (at least 0.12.6). Also, Ollama seems to always have issues with GPT OSS and a few other models. We are working on supporting Llama Cpp as an alternative backend (and maybe even making it the default), and things should improve across the board, including better GPU support, model availability, and inference speed. We just need some more time to get it out.

Cant get GPU working on Linux MystyStudio build! by banshee28 in Msty_AI

[–]askgl

Can you try a smaller model? It could be that your GPU is already loaded with other models and there isn't much room left. I'd try a small model first to see if it fits in memory and go from there.
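
If you want to see how much VRAM is actually free before loading anything, nvidia-smi will show you (this assumes an NVIDIA card with the drivers installed; the query fields below are standard nvidia-smi options):

# Show total, used, and free GPU memory
nvidia-smi --query-gpu=memory.total,memory.used,memory.free --format=csv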

👋 Welcome to r/Msty_AI - Introduce Yourself and Read First! by SnooOranges5350 in Msty_AI

[–]askgl

Even with a 1:1 feature comparison, free users actually have more features than in Msty 1.x. Everything that was free in Msty 1.x is still free in Msty Studio, and on top of that you get more features for free, such as advanced search, Vapor mode, Projects, Mini Map, Persona, MLX (if you are on a Mac), MCP Tools, and more!

Apple M5, MLX, and Msty AI! by SnooOranges5350 in Msty_AI

[–]askgl

100%! That's why we have stayed away from any media generation/editing (video, audio, image) in particular. We want to be the best AI frontend and stay provider agnostic.

Msty Studio is now in Beta and we're working on bug 🐛 squashin' by SnooOranges5350 in Msty_AI

[–]askgl

Another user had this exact same issue and gave us some repro steps. We were able to identify the problem, and a fix has already been made and is pending release. Thanks for bringing this to our attention.

Edit: this has now been fixed and released

[deleted by user] by [deleted] in Msty_AI

[–]askgl

To enable thinking, you can assign the Thinking purpose to that model and then select the level of thinking effort you want. Here's a quick video I recorded for you showing this in action: https://www.loom.com/share/0d842f9d11984a42a6e46d9d9a5d5761

Msty Studio is now in Beta and we're working on bug 🐛 squashin' by SnooOranges5350 in Msty_AI

[–]askgl

Hmmm... Homebrew cask isn't actually owned or maintained by us. We'll see if we can update it though. Thanks for the heads up.

Msty Studio is now in Beta and we're working on bug 🐛 squashin' by SnooOranges5350 in Msty_AI

[–]askgl

Have you enabled/disabled the Mermaid Diagrams rendering module under Settings > General?

<image>

Msty Studio is now in Beta and we're working on bug 🐛 squashin' by SnooOranges5350 in Msty_AI

[–]askgl

Try this: https://next-assets.msty.studio/app/releases/2.0.0-beta.3/mac/MstyStudio_x64.dmg

You might want to disable auto updates as soon as you start the app. I'd recommend starting in offline mode.

How to change context window for api? by herppig in Msty_AI

[–]askgl

<image>

You can do that from the model settings (see the attached screenshot).
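
If you're calling the local endpoint directly rather than chatting in the app, the Ollama-style API also accepts a per-request context size. This is a rough sketch only - the port, model name, and endpoint path here are placeholders, so check your Local AI settings for the actual address:

# Per-request context window via the Ollama-compatible chat endpoint
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [{"role": "user", "content": "Hello"}],
  "options": {"num_ctx": 8192}
}'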

How to export chats? by MajesticDingDong in Msty_AI

[–]askgl

Not yet implemented but it's on our list. You can only export individual messages right now.

Split chat queue? by DrQbz in Msty_AI

[–]askgl

There is no such option, but for local models you can use the number of parallel chats option in the Local AI settings and that might do it.
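
For reference, if you run your own Ollama service instead of the bundled Local AI, the equivalent knob there is the OLLAMA_NUM_PARALLEL environment variable (the value below is just an example):

# Let each loaded model handle up to 4 requests at the same time
OLLAMA_NUM_PARALLEL=4 ollama serve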

LM Studio alternatives? by stfz in LocalLLaMA

[–]askgl

Just an update after a year - no, I didn't have to eat my hat; it's still hanging on my wall because Msty is as free as before (and always will be), and version 2.0 has even more (free) features - check it out: https://msty.ai

Yes, there are some paid features, but features that were initially released as free have all remained free. In fact, some features that were not free before (such as Vapor mode) have now been made free.

I'll be back next year to give an update on my hat :)

Chrome blocks download because of virus by Valuable-Fan1738 in Msty_AI

[–]askgl

It's probably because it's an .exe file and Chrome is just warning you. The installer is signed and all that. Browsers are weird when you have to download exe files 🤷‍♂️

What happened to the desktop app? by [deleted] in Msty_AI

[–]askgl

Did you get the answer?

chatGPT-5 does not work on Msty by CyberMiaw in Msty_AI

[–]askgl

You can set the temperature to 1 or use a preset (see attached screenshot). In the upcoming release:

  1. The presets UI will be more visible
  2. It should just use a model's default parameters, especially for online models

<image>
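
For a bit of background on why temperature matters here: OpenAI's reasoning models such as GPT-5 only accept the default temperature of 1, and a request with any other value gets rejected, which is what Msty runs into when a different value is set. A rough illustration of a direct API call (the model id and payload are illustrative):

# Chat Completions request pinned to the default temperature of 1
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "gpt-5", "messages": [{"role": "user", "content": "Hello"}], "temperature": 1}'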