Invalid Type for 'Input' Error When Using a Local LLM

I've been trying some models with the Assistant feature, using LM Studio with its OpenAI-compatible API.

Some models that trigger this error:

  • NVIDIA Nemotron Nano
  • qwen3
  • GLM-4.6

This is what I found in the LM Studio log:

2026-02-11 09:11:42 [ERROR]

{
  "error": {
    "message": "Invalid type for 'input'.",
    "type": "invalid_request_error",
    "param": "input",
    "code": "invalid_union"
  }
}
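For what it's worth, that `invalid_union` code suggests the `input` field in the request body failed a union-type check: the Responses API expects `input` to be either a plain string or an array of message objects. A rough sketch of the distinction (the model name and the validation logic are illustrative guesses, not LM Studio's actual schema):

```javascript
// Valid: `input` as a plain string.
const validString = { model: "qwen3", input: "Hello" };

// Valid: `input` as an array of message objects.
const validArray = {
  model: "qwen3",
  input: [{ role: "user", content: "Hello" }],
};

// Invalid: a bare object is neither branch of the union,
// so a schema validator would reject it with "invalid_union".
const invalidShape = {
  model: "qwen3",
  input: { role: "user", content: "Hi" },
};

// Hypothetical check mirroring the union the error message implies.
function inputIsValid(body) {
  return (
    typeof body.input === "string" ||
    (Array.isArray(body.input) &&
      body.input.every((m) => typeof m === "object" && m !== null))
  );
}
```

Since the error shows up on the *second* request, one possibility is that the follow-up turn sends the conversation history in a shape the validator doesn't accept.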

We haven’t developed with LM Studio in mind, but it should work if you add a custom OpenAI model in Bootstrap Studio and provide the endpoint for the Responses API: Responses | LM Studio Docs
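As a rough sketch of what that request would look like (the port, model name, and prompt below are assumptions; check the address shown in LM Studio's server tab):

```javascript
// Pointing an OpenAI-style request at a local LM Studio server.
// Port 1234 is LM Studio's usual default, but verify it locally.
const endpoint = "http://localhost:1234/v1/responses";

const request = {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "qwen3",     // whichever model is loaded in LM Studio
    input: "Say hello", // the Responses API uses `input`, not `messages`
  }),
};

// fetch(endpoint, request) would send the call once the local server
// is running; it is left un-called here so the sketch stands alone.
```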

Thanks for the response, Martin, but this mostly doesn't happen on the first request. After the first request completes successfully, the problem appears when I reply for the next step. I usually use this API URL in other editors with no issues, so if there's anything I can do to help figure out the issue with BS Studio, I will.

Thanks.

If possible, can you test whether Ollama works for you?

OK, let me test with Ollama tomorrow with the same models.

Tested with Ollama with the same models. No error appears, but after waiting a long time for the response it seems stuck, and I get this message:

[screenshot]

Unfortunately I don’t think it’s a problem with Bootstrap Studio. The app just calls the model API as we do with Google/Anthropic/OpenAI/OpenRouter. Maybe the models you’re trying are too small and not “smart enough” to respect the instructions the app gives them. I’ve reliably used gpt-oss, but there are new models constantly being released so the only way is to try them.


Hi, I tested with LM Studio with those settings, and the connection test is OK.
But nothing happens in BSS.

All with the context at 256k.

With Ollama it’s OK 🙂

But after a moment, at the end of the response when it's nearly finished, BSS freezes.

Are there any minimum criteria for the model used? I mostly use 8–20B models for local LLMs.

Just a guess: there may be a problem with handling [object Object], but BSS doesn't catch this event as an error, so BSS stays idle waiting while Ollama or LM Studio has already stopped processing the stream.
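To illustrate that guess: in JavaScript, "[object Object]" is what you get when an object is concatenated into a string without being serialized first, so something like this may be happening in the stream handler (a hypothetical sketch, not BSS's actual code):

```javascript
// A streamed chunk arrives as a parsed object, not a string.
const chunk = { type: "message", content: "Hi" };

// Concatenating the object directly triggers its default toString(),
// producing the literal text "[object Object]".
const wrong = "assistant: " + chunk;

// Serializing (or reading the right property) keeps the real content.
const right = "assistant: " + chunk.content;
```

If something like `wrong` ends up in the transcript, the stream could appear finished on the server side while the client never sees a clean terminator.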

Thanks.
It doesn’t happen every time… I noticed this problem when it wanted to create a file.
I need to do more tests.
During another attempt, it didn’t find the tool, so it asked me to copy-paste the HTML and CSS code 😅.
