Currently, LLM parameters can only be configured directly in the code, with the exception of a few environment variables such as MODEL_TEMP (temperature).
See the model settings for supported values. We should discuss which settings to expose (e.g., reasoning settings for GPT-5), or whether it should be possible to pass in a dict instead.
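One possible shape for the dict-based option, as a minimal sketch (the defaults, env variable handling beyond MODEL_TEMP, and function name are all illustrative, not the current implementation):

```python
import os

# Illustrative defaults; the actual supported keys come from the model settings.
DEFAULTS = {
    "temperature": 0.7,
    "max_tokens": 1024,
    # "reasoning_effort": "medium",  # hypothetical key for reasoning models like GPT-5
}

def load_llm_config(overrides=None):
    """Merge defaults, ENV overrides, and an optional caller-supplied dict."""
    config = dict(DEFAULTS)
    env_temp = os.getenv("MODEL_TEMP")
    if env_temp is not None:
        config["temperature"] = float(env_temp)  # existing ENV override
    if overrides:
        config.update(overrides)  # user dict takes precedence
    return config

print(load_llm_config({"max_tokens": 2048}))
```

With this layering, ENV variables keep working as before while a dict gives callers full control without code changes.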