docs/self-hosting/govern/plane-ai.md
You can provide API keys for both OpenAI and Anthropic, making all models available to users. If you provide only one key, users will only have access to that provider's models.
:::tip
If you need to use an LLM that isn't from OpenAI or Anthropic (for example, an open-source model, or a regional provider for compliance reasons), you can proxy it through [LiteLLM](https://docs.litellm.ai). LiteLLM exposes any LLM behind an OpenAI-compatible API, which Plane can then connect to using the `CUSTOM_LLM_*` variables with `CUSTOM_LLM_PROVIDER=openai`.
:::
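As a concrete sketch, the Plane-side environment for a LiteLLM proxy could look like the following. Only `CUSTOM_LLM_PROVIDER=openai` is confirmed by this page; the other variable names, the proxy URL, the key, and the model alias are illustrative assumptions — check your Plane deployment's environment reference for the exact `CUSTOM_LLM_*` names.

```shell
# Hypothetical sketch: point Plane's custom-LLM backend at a LiteLLM proxy.
# Only CUSTOM_LLM_PROVIDER=openai comes from this doc; the remaining
# variable names and values are assumed for illustration.
CUSTOM_LLM_PROVIDER=openai                # LiteLLM speaks the OpenAI-compatible API
CUSTOM_LLM_API_BASE=http://litellm:4000   # assumed: base URL of your LiteLLM proxy
CUSTOM_LLM_API_KEY=sk-litellm-master-key  # assumed: key configured on the proxy
CUSTOM_LLM_MODEL=my-model-alias           # assumed: model alias defined in LiteLLM
```

On the LiteLLM side you would run the proxy with your actual model configured behind that alias, so Plane only ever sees an OpenAI-shaped endpoint.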
#### Custom models (self-hosted or third-party)
Plane AI supports custom models through two backends:
One custom model can be configured alongside your public provider keys.
:::warning
The custom model should have at least 1 trillion parameters for all Plane AI features to work reliably. Larger, more capable models yield better results.
:::