Exploring applications for local LLM interaction

I’m curious about what GUIs or applications y’all are using to interact with your local LLMs—whether on mobile, desktop, or web.

I’m currently running LM Studio and experimenting with a few models. LM Studio has a server mode that exposes an OpenAI-compatible API, and I’m looking into Tailscale or Cloudflared to reach it remotely. Ideally, I’d love a setup that lets me:

✅ Access my local LLM remotely

✅ Switch models easily

✅ Have a smooth GUI or API interaction (rough sketch of what I mean below)
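
To make that last point concrete, here’s a minimal Python sketch against LM Studio’s OpenAI-compatible endpoints (1234 is its default server port; the remote URL, the model choice, and the timeouts are assumptions you’d adapt to your own tunnel setup):

```python
# Minimal sketch, assuming LM Studio's server is running with its
# OpenAI-compatible endpoints (/v1/models, /v1/chat/completions) on
# the default port 1234. Remotely, BASE_URL would instead be the
# machine's Tailscale MagicDNS name, or the URL printed by a quick
# tunnel like: cloudflared tunnel --url http://localhost:1234
import requests

BASE_URL = "http://localhost:1234/v1"

# List whatever models the server exposes -- this covers the
# "switch models easily" point, since any listed id can be used below.
models = requests.get(f"{BASE_URL}/models", timeout=10).json()["data"]
print("available:", [m["id"] for m in models])

# Send a chat completion to the first listed model.
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": models[0]["id"],
        "messages": [{"role": "user", "content": "Hello from my phone!"}],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```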

For those already running a local LLM, what’s your setup?

• Do you use Ollama, LM Studio, Text Generation WebUI, or something else?

• How do you connect to it remotely?

• What’s your preferred interface—web UI, CLI, mobile app, or a custom API?