local ai on mac: native apps compared
of these i only actually downloaded gpt4all, chital, anythingllm, and lm studio
lm studio was a pretty clear winner for me
app | pros | cons |
---|---|---|
gpt4all | really easy to add a directory for embedding; pretty nice interface | messy installation; doesn’t play well with ollama downloads?; can’t narrow the window to less than 2/3 of the screen |
chital | very lightweight (18 MB app); starts instantly; very FOSS; wonderfully clean interface | basically no additional features beyond the CLI, such as syntax highlighting or docs |
anythingllm | workspace separation logic; can handle directories, though not as elegantly as gpt4all | kinda ugly; upload & embed GUI is finicky and tedious; workspaces can overcomplicate things |
lm studio | can link ollama-downloaded models using gollama; pretty feature-rich; ability to limit unneeded UI elements (user mode vs developer mode); great information display for models; really solid in-gui huggingface model browser; indicates whether a model can run on your machine (!); mlx support (not sure if other apps have this?); straightforward installation; epic server interface (see the sketch after the table) | hard to complain; connection to outside apis appears limited/not possible, this is a local-only deal, but the server pane makes it attractive as the base manager for models served to clients like open webui if other features are needed; attachments are possible, but not quite at the level of anythingllm and gpt4all as far as i could see |
ollamac (FOSS version) | seems very similar to chital, but more formatting/graphic design; FOSS; syntax highlighting | unclear if it supports docs |
chatd | can add docs; minimal interface | not very pretty; only 1 file at a time?; no syntax highlighting |
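
since the server pane is the main reason i’d keep lm studio around as the base model manager, here’s a minimal sketch of what a client hitting it looks like. this assumes the server is running on lm studio’s default port (1234) with its OpenAI-compatible endpoint; the model name and api key below are placeholders, not real identifiers.

```python
# minimal sketch: pointing an OpenAI-style client at LM Studio's local server.
# assumes the server is running on the default port (1234) and a model is
# already loaded; "your-model-name" is a placeholder, not a real identifier.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
    api_key="not-needed-locally",         # any placeholder string works, nothing leaves your machine
)

response = client.chat.completions.create(
    model="your-model-name",  # placeholder; use whatever identifier the server pane shows
    messages=[{"role": "user", "content": "say hi in one sentence"}],
)

print(response.choices[0].message.content)
```

the same base url is what you’d point open webui (or any other OpenAI-compatible client) at if you only want lm studio for model serving and a different app for the chat interface.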