BYOM
Securely connect and manage your LLM providers with custom API credentials.
Overview
Bring Your Own Model (BYOM) provides a secure way to add, connect, and manage API credentials for your preferred Large Language Model (LLM) providers. By default, Kognitos uses OpenAI and Gemini, but BYOM allows you to configure alternative or custom providers at the agent level.
Supported Providers
BYOM supports the following providers and models:
OpenAI: GPT-4o, GPT-4o Mini, gpt-4o-realtime-preview, gpt-4.1-mini
Google Gemini: Gemini 2.5 Pro, Gemini 2.0 Flash, gemini-2.5-flash, gemini-2.5-flash-lite
Google Vertex AI: Gemini 2.5 Pro, Gemini 2.0 Flash, gemini-2.5-flash, gemini-2.5-flash-lite
Anthropic Claude: claude-opus-4, claude-sonnet-4.5, claude-haiku-4.5
Custom: any custom LLM model not listed above
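How a model from this list is referenced depends on the provider. For OpenAI, for example, the model identifier is passed directly to the LLM procedure, as in the minimal sketch below, which reuses the statement shown under Usage (the field details are illustrative only):
extract data from the document
  the openai model is "gpt-4.1-mini"
  the first field is "invoice number"
  the first field's format is "string"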
Configuration
Follow these steps to configure an LLM provider in Kognitos:
Note: BYOM must be configured separately for each agent.
Usage
After configuring models through BYOM, automations will use your stored credentials when executing LLM procedures.
Standard Providers
For standard providers like OpenAI, Google Gemini, or Anthropic, reference the model directly in your automation. The example below uses the extract data procedure with an OpenAI model:
extract data from the document
  the dpi is 144
  the openai model is "gpt-4.1-mini"
  the first field is "po number"
  the first field's format is "string"
  the first field's rule is "the po number has 10 characters"
Custom Providers
For custom models, specify the provider ID, indented within the LLM procedure:
the model provider is "provider-id"
For example:
extract data from the document
  the model provider is "custom-custom-llm-1764100945699-r6jisd"
  the first field is "invoice number"
  the first field's format is "string"
Retrieve the ID from the BYOM interface after saving a custom LLM configuration.
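Putting the pieces together, a complete custom-provider extraction might look like the following sketch; the provider ID, fields, and rule are illustrative placeholders drawn from the examples above:
extract data from the document
  the model provider is "custom-custom-llm-1764100945699-r6jisd"
  the dpi is 144
  the first field is "po number"
  the first field's format is "string"
  the first field's rule is "the po number has 10 characters"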
