- Onboarding - Msty Docs
This documentation covers Msty App v1.x, the original version of Msty. We're now actively building and expanding Msty Studio, the next-generation app (v2.0) with a growing set of features and capabilities.
- Download Offline Models - Msty Docs
Msty lets you download a wide variety of models to use offline with Local AI. You can choose to install any model from Ollama or import supported GGUF model files from HuggingFace, directly within Msty.
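For illustration only, here is a minimal Python sketch of what pulling a model through an Ollama-compatible service looks like; the port (10000) and the model name are assumptions, and Msty normally handles all of this from its UI.

```python
# Minimal sketch: pull a model through an Ollama-compatible /api/pull
# endpoint. The base URL/port and model name are assumptions for
# illustration; use Msty's UI for the supported workflow.
import json
import urllib.request

def pull_model(name: str, base_url: str = "http://localhost:10000") -> None:
    """Request a model pull and print the streamed progress lines."""
    req = urllib.request.Request(
        f"{base_url}/api/pull",
        data=json.dumps({"name": name}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:  # the endpoint streams newline-delimited JSON
            print(json.loads(line).get("status", ""))

if __name__ == "__main__":
    pull_model("mistral-small")
```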
- AMD ROCm on Windows Issues - Msty Docs
Msty supports AMD ROCm on Windows out of the box, and we even have a dedicated installer for it. However, sometimes things can go wrong and your GPU card might not be supported or detected at all.
- GPUs Supported by Msty - Msty Docs
Msty supports a wide range of GPUs for faster inference. Check if your GPU is supported by Msty.
- OpenAI Deep Research like service with Msty - Msty Docs
Any model you have in Msty can be used by changing the model names in the config. If you want to use the default settings, make sure mistral-small and deepseek-r1:14b are available in Msty and named as is.
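As a rough sketch of the "make sure the default models are available" step, the Python snippet below checks a local Ollama-compatible service for the two default models via the standard /api/tags endpoint; the port is an assumption, and the actual Deep Research config format is not shown here.

```python
# Sketch: verify the default models referenced in the config are present
# locally. /api/tags is the standard Ollama endpoint for listing installed
# models; the port is an assumption for illustration.
import json
import urllib.request

REQUIRED = ["mistral-small", "deepseek-r1:14b"]

def installed_models(base_url: str = "http://localhost:10000") -> set:
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        data = json.load(resp)
    return {m["name"] for m in data.get("models", [])}

models = installed_models()
# Installed names may carry a tag suffix (e.g. "mistral-small:latest").
missing = [r for r in REQUIRED if not any(m.startswith(r) for m in models)]
if missing:
    print("Missing models:", ", ".join(missing))
else:
    print("All default models are available.")
```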
- Miscellaneous Troubleshooting - Msty Docs
Reinstall the app by first deleting the lib folder and ensuring msty-local is not running. If the issue persists, please join our Discord and ask for help in the #msty-app-help channel.
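The Python sketch below mirrors that manual cleanup on POSIX systems: it checks that msty-local is not running, then removes the lib folder. The data-directory path shown is a hypothetical example; substitute the actual Msty data directory on your system.

```python
# Sketch of the cleanup described above: refuse to proceed if msty-local
# is still running, then delete the lib folder. The path below is a
# hypothetical example, not a confirmed default.
import shutil
import subprocess
import sys
from pathlib import Path

lib_dir = Path.home() / ".config" / "Msty" / "lib"  # hypothetical location

# Best-effort process check (POSIX only, via pgrep).
result = subprocess.run(["pgrep", "-f", "msty-local"], capture_output=True)
if result.returncode == 0:
    sys.exit("msty-local is still running; quit Msty first.")

if lib_dir.exists():
    shutil.rmtree(lib_dir)
    print(f"Removed {lib_dir}; now reinstall the app.")
else:
    print(f"No lib folder found at {lib_dir}.")
```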
- Real-Time Data - Msty Docs
Msty's Real-Time Data feature allows you to fetch live data from the internet and use it to enrich your chat conversations. You can easily toggle this feature on or off as needed.
- Get the latest version of Local AI service - Msty Docs
For your convenience, Msty bundles the latest version of the Local AI service (Ollama) available at the time of the app release. However, if you want to get the latest version of the Local AI service, first try going to Settings > Local AI > Service Version and clicking Check for Updates.
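If you want to confirm which service version is currently running, a minimal Python sketch is shown below; it queries the standard Ollama /api/version endpoint, with the port assumed for illustration. The Settings > Local AI > Service Version screen remains the supported route.

```python
# Sketch: ask the bundled Local AI (Ollama) service for its version via
# the standard /api/version endpoint. The port is an assumption.
import json
import urllib.request

def service_version(base_url: str = "http://localhost:10000") -> str:
    with urllib.request.urlopen(f"{base_url}/api/version") as resp:
        return json.load(resp).get("version", "unknown")

print("Local AI service version:", service_version())
```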