Request support for defining custom instructions per LLM configuration, allowing multiple logical “personas” (e.g. engineering vs product) to be configured against the same underlying LLM provider and API key.
This would enable teams to reuse a single LLM backend while offering different response styles through Sourcebot (web UI and MCP).
Problem Statement
Sourcebot supports configuring one or more language models, but in practice teams often want multiple response styles, e.g.:
- Engineering-focused (code-level, file paths, implementation details)
- Product/business-focused (high-level, conceptual, minimal code)
These styles typically use:
- the same LLM provider
- the same API key
- the same underlying model
Example:

```json
"models": [
  {
    "name": "engineering",
    "provider": "openai",
    "baseUrl": "...",
    "model": "...",
    "apiKey": { "env": "LLM_API_KEY" },
    "instructions": "Answer as an engineer with detailed code references..."
  },
  {
    "name": "product",
    "provider": "openai",
    "baseUrl": "...",
    "model": "...",
    "apiKey": { "env": "LLM_API_KEY" },
    "instructions": "Answer at a high level for non-engineering audiences..."
  }
]
```