Ollama/LLM support

Is it feasible to integrate support for local LLMs through the Ollama API as an additional option next to ChatGPT? Ollama exposes an OpenAI-compatible API, which should simplify the integration of local LLM support.
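For illustration, this is roughly what the integration could look like from the client side; just a minimal sketch using the official OpenAI Python library, assuming Ollama is running on its default port (11434) and a model such as llama3 has already been pulled:

```python
from openai import OpenAI

# Point the standard OpenAI client at Ollama's local
# OpenAI-compatible endpoint. Ollama ignores the API key,
# but the library requires a non-empty string.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # placeholder, not a real key
)

response = client.chat.completions.create(
    model="llama3",  # assumes `ollama pull llama3` was run beforehand
    messages=[
        {"role": "user", "content": "Write a Bootstrap 5 card component."},
    ],
)
print(response.choices[0].message.content)
```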

Ollama runs locally and is free to use (no OpenAI API key or internet connection required), giving users greater control.

See: OpenAI compatibility · Ollama Blog

I agree… local models and more API access options for coding-capable LLMs would be good.

Thank you for the suggestion! We would like to see more users expressing interest before committing to adding it to the app. So if others wish to see this added to Bootstrap Studio, write below.

For those unfamiliar with LLMs, Ollama is a small program that runs in the background on your PC and gives you local access to various large language models, similar to ChatGPT/OpenAI. There are several advantages:

  • It’s entirely free; no API key or OpenAI account required.
  • It works without internet access, which is useful when travelling or on a plane.
  • It’s flexible; you can pick whichever LLM you prefer.
  • It’s secure; your code never leaves your PC.

Yes yes yes. I use Jan and Ollama.

Yes!!! And LM Studio too, which is fully compatible with the OpenAI API at the URL http://localhost:1234/v1
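If the compatibility claim holds, switching back ends should just be a matter of changing the base URL; a sketch, assuming the same OpenAI Python client as above and LM Studio's local server started on its default port:

```python
from openai import OpenAI

# Same client as with Ollama, only the endpoint changes:
# LM Studio's local server defaults to port 1234.
client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",  # placeholder; LM Studio doesn't check the key
)
```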

I would also be very interested in this feature.

Yes, please! Local LLMs are getting better and better.

I use LM Studio a lot; being able to add our own API URL to Bootstrap Studio would be great.

Ollama would be a great option

Please add Ollama support

I think this is an essential feature, a must-have, since many people run local open-source AIs and they are getting very good at coding! It should be pretty easy to implement; just ask the AI :laughing:

Did anybody try this? Theoretically it should just work by pointing to the local OpenAI-compatible API, am I wrong?
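Before wiring anything into an app, there is a cheap way to check whether the endpoint responds at all; just a sketch with the requests library, assuming Ollama's default port and that your Ollama version exposes the OpenAI-style /v1/models route:

```python
import requests

# List the locally available models through the
# OpenAI-compatible endpoint as a connectivity check.
resp = requests.get("http://localhost:11434/v1/models", timeout=5)
resp.raise_for_status()
for model in resp.json()["data"]:
    print(model["id"])
```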

You mean you are already using local LLMs in BSS just by plugging them in?

No, what I meant was that I use local LLMs for different tasks. In my mind, while Ollama runs, Bootstrap Studio could connect to Ollama instead of requiring OpenAI.

Hopefully this will be possible in a future release.

Yes, please add it; we need it.

If it has an OpenAI-compatible API, theoretically it should work, no? Did you try it? I actually use the AI feature in BSS a lot; it works OK with ChatGPT, but I am very curious what happens if you plug in a coding model that runs locally!

I completely agree with the previous posters: it’s worth offering more API integration options for AI models beyond just GPT.
For instance, Claude 3.7 excels in this area. Expanding the integration capabilities could be a strong competitive advantage for your product.

Support for the OpenRouter API would also be great; it is a lot more flexible than OpenAI (more models available at various price points), in addition to local options such as Ollama and LM Studio.
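OpenRouter speaks the same protocol, so in principle the same client works there too; a sketch, assuming an OpenRouter account and an API key in the OPENROUTER_API_KEY environment variable:

```python
import os

from openai import OpenAI

# OpenRouter is also OpenAI-compatible; unlike the local
# options, it is a hosted service and needs a real API key.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)
```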

Hi, same for me; a local LLM option would be a great plus.