We haven’t developed with LM Studio in mind, but it should work if you add a custom OpenAI model in Bootstrap Studio and provide the endpoint for the Responses API: Responses | LM Studio Docs
Thanks for the response, Martin, but this mostly does not happen on the first request. After the first request completes successfully, the problem appears when I reply for the next step. I usually use this API URL in other editors without any issue, so if there is anything I can do to help figure out the issue with Bootstrap Studio, I will.
Unfortunately I don’t think it’s a problem with Bootstrap Studio. The app just calls the model API the same way it does with Google/Anthropic/OpenAI/OpenRouter. Maybe the models you’re trying are too small and not “smart enough” to respect the instructions the app gives them. I’ve reliably used gpt-oss, but new models are constantly being released, so the only way to know is to try them.
Just a guess: there may be a problem with handling [object Object], but BSS doesn’t catch this event as an error, so BSS stays idle waiting while Ollama or LM Studio has already stopped processing the stream.
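For what it’s worth, “[object Object]” is usually the symptom of a JavaScript object being coerced into a string instead of being serialized. A minimal sketch of that failure mode (the function names and error shape here are hypothetical, not from Bootstrap Studio’s actual code):

```javascript
// Buggy: interpolating an error object into a template literal calls
// its default toString(), which yields "[object Object]".
function formatStreamError(err) {
  return `stream error: ${err}`;
}

// Fixed: serialize the object explicitly so the real message survives
// and the caller can detect the failure instead of waiting forever.
function formatStreamErrorFixed(err) {
  return `stream error: ${JSON.stringify(err)}`;
}

const err = { code: 500, message: "model stopped streaming" };
console.log(formatStreamError(err));      // "stream error: [object Object]"
console.log(formatStreamErrorFixed(err)); // includes "model stopped streaming"
```

If something like the first variant runs when the stream aborts, the client would log a useless string rather than raising an error, which would match the “BSS stays idle while the server has already stopped” behavior described above.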
Thanks.
It doesn’t happen every time… I noticed this problem when the model wanted to create a file.
I need to do more tests.
During another attempt, the model did not find the tool, so it asked me to copy-paste the HTML and CSS code.