Would it be feasible to add support for local LLMs through the Ollama API as an option alongside ChatGPT? Ollama exposes an OpenAI-compatible API, which should simplify the integration of local LLM support.
Ollama runs locally and is free to use (no OpenAI API key or internet connection required), giving users greater control.
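To illustrate the "OpenAI-compatible" point: a minimal sketch of what a chat request to a local Ollama instance looks like, assuming Ollama's default port (11434) and a locally pulled model named "llama3" (both are assumptions; your setup may differ). The sketch only builds the request; actually sending it requires Ollama to be running.

```python
import json
import urllib.request

# Ollama serves an OpenAI-compatible chat endpoint at /v1/chat/completions.
# Port 11434 is Ollama's default; "llama3" is a placeholder model name that
# must match a model you have pulled locally (e.g. via `ollama pull llama3`).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(prompt, model="llama3"):
    """Build the same JSON payload an OpenAI API client would send."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With Ollama running, the response has the familiar OpenAI shape:
# with urllib.request.urlopen(build_chat_request("Write a Bootstrap navbar")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the payload and response shape match OpenAI's, an app that already talks to the OpenAI API would, in principle, only need a configurable base URL.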
Thank you for the suggestion! We would like to hear more users express interest before committing to adding it to the app. So if others wish to see this added to Bootstrap Studio, please comment below.
For those unfamiliar with LLMs, Ollama is a small program that runs in the background on your PC and gives you local access to various large language models, similar to ChatGPT/OpenAI. It has several advantages:
It’s entirely free; no API key or OpenAI account required.
Works without internet access, which is handy when traveling or on a plane.
Offers flexibility; you can select your preferred AI LLM model.
Ensures security; no code ever leaves your local PC.
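On the flexibility point (choosing your preferred model): Ollama's OpenAI-compatible layer also exposes a model-listing endpoint, so an app could populate a model picker from whatever the user has pulled locally. A minimal sketch, again assuming the default port 11434; the request is only built here, and sending it requires a running Ollama instance.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible /v1/models endpoint lists locally pulled models.
# Port 11434 is the default Ollama port (an assumption about your setup).
OLLAMA_MODELS_URL = "http://localhost:11434/v1/models"

def build_models_request():
    """Build a GET request for the list of locally available models."""
    return urllib.request.Request(OLLAMA_MODELS_URL, method="GET")

# With Ollama running, the response follows the OpenAI list format:
# with urllib.request.urlopen(build_models_request()) as resp:
#     model_names = [m["id"] for m in json.load(resp)["data"]]
```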
I think this is an essential feature! It's a must-have, since many people run local open-source AIs, and they are getting very good at coding! It should be pretty easy to implement (just ask the AI).
No, what I meant was that I use local LLMs for different tasks. In my mind, while Ollama is running, Bootstrap Studio could connect to Ollama instead of requiring OpenAI.
Hopefully it will be possible in a future release.
If it has an OpenAI-compatible API, it should theoretically work, no? Did you try it? I actually use the AI feature in BS a lot; it works OK with ChatGPT, but I'm very curious what happens if you plug in a coding model that runs locally!
I completely agree with the previous posters: it's worth offering more API integration options for AI models beyond just GPT.
For instance, Claude 3.7 excels in this area. Expanding integration capabilities could be a strong competitive advantage for your product.
Support for the OpenRouter API would also be great; it is a lot more flexible than OpenAI (more models available across different price tiers), in addition to local options such as Ollama and LM Studio.