Ollama/local LLM support

Is it feasible to integrate support for local LLMs through the Ollama API as an additional option alongside ChatGPT? The Ollama API is compatible with the OpenAI API, which should simplify the integration of local LLM support via Ollama.
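To illustrate the compatibility point: a minimal sketch of what the integration could look like. Because both APIs accept the same chat-completion request body, only the base URL changes; the model name `llama3` and port 11434 are just Ollama's documented defaults and are used here as assumptions.

```python
import json

# Hypothetical endpoints: the request payload is identical for both;
# only the base URL (and the need for an API key) differs.
OPENAI_URL = "https://api.openai.com/v1/chat/completions"
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"  # Ollama's default port

def build_chat_request(model: str, prompt: str) -> str:
    """Build an OpenAI-style chat completion payload (works for both APIs)."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

# Same payload either way; the local Ollama server needs no API key.
payload = build_chat_request("llama3", "Generate a Bootstrap card component.")
```

So from the app's side, adding Ollama could be as small as letting the user override the API base URL and skip the key check.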

Ollama runs locally and is free to use (no OpenAI API key or internet connection required), giving users greater control.

See: OpenAI compatibility · Ollama Blog


I agree. Local options and more API access for coding-capable LLMs would be good.

Thank you for the suggestion! We would like to hear from more users expressing interest before committing to adding it to the app. So if you would like to see this added in Bootstrap Studio, write below.

For those unfamiliar with LLMs, Ollama is a small program that runs in the background on your PC and gives you local access to various large language models, similar to ChatGPT/OpenAI. It has several advantages:

  • It’s entirely free; no API key or OpenAI account is required.
  • It works without internet access, which is useful when traveling or on a plane.
  • It’s flexible; you can choose your preferred LLM.
  • It’s secure; no code ever leaves your PC.

Yes yes yes. I use Jan and Ollama.

Yes!!! And LM Studio too; it’s fully compatible with the OpenAI API at the URL http://localhost:1234/v1