It would be great if you could implement Ollama support, as well as OpenAI o4-mini and many other models.
I'm currently using LM Studio. Would love to see your dev team bring that and Ollama into BSS.
I'm also a regular Bootstrap Studio user and would love to see support for local LLMs like Ollama in a future update. Many of us work in environments where offline development is important, and being able to generate content/code with a local API would be a huge boost to productivity.
The current AI features are great, but if we could point Bootstrap Studio to a local endpoint (e.g. http://localhost:11434/api) with OpenAI-compatible API support, it would open up so many more possibilities, especially for those using custom or open-source models.
Please consider adding this flexibility in an upcoming release!
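To make the request concrete: here is a minimal sketch of what a configurable OpenAI-compatible endpoint could look like. The base URL and model name below are assumptions (Ollama's default port and its OpenAI-compatible `/v1/chat/completions` route); any compatible server such as LM Studio, or a remote tunnel, would work by swapping the base URL.

```python
# Sketch only: shows how a configurable base URL could map onto an
# OpenAI-compatible chat request. Defaults below are assumptions
# (Ollama's default port; "llama3" as a placeholder model name).
DEFAULT_BASE_URL = "http://localhost:11434"

def build_chat_request(prompt: str,
                       base_url: str = DEFAULT_BASE_URL,
                       model: str = "llama3") -> tuple[str, dict]:
    """Return the URL and JSON payload for an OpenAI-compatible chat call."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

url, payload = build_chat_request("Write a Bootstrap hero section")
print(url)  # http://localhost:11434/v1/chat/completions
# Actually sending it would be e.g.: requests.post(url, json=payload).json()
```

The point is that nothing else in the request changes when the base URL does, which is why users below ask for the URL to stay configurable rather than hard-coded to localhost.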
Yes, I logged into the forum to see if there were any plans for this. It would be a must-have, in my opinion. Ideally support for LM Studio and Ollama desktop.
I'd love to see Ollama support added. Sounds exciting!
Yes, I strongly agree on Ollama, and also OpenRouter as a way to bring in AI.
+1
For my part, I'm using a remote Ollama (on a domain reachable through a reverse tunnel) on my desktop when I'm on my laptop, so I think the URL should stay configurable (i.e. no fixed http://localhost:11434).
+1
My data, my rules… local AI must be supported.
+1.
I would definitely like local AI support.
+1. I am keen on connecting to LLMs hosted on AWS/Azure with OpenAI-compatible endpoints. All it would take is changing the endpoint.
It would be nice if we had the option to use our local LLM, or other companies that provide API access to their LLMs. Please include this in your update.
Yeah, please add it, it would be very helpful for us.