BYOM

Securely connect and manage your LLM providers with custom API credentials.

Overview

Bring Your Own Model (BYOM) provides a secure way to add, connect, and manage API credentials for your preferred Large Language Model (LLM) providers. By default, Kognitos uses OpenAI and Gemini, but BYOM allows you to configure alternative or custom providers at the agent level.

Supported Providers

BYOM supports the following providers and models:

| Provider | Supported Models |
| --- | --- |
| OpenAI | GPT-4o, GPT-4o Mini, gpt-4o-realtime-preview, gpt-4.1-mini |
| Google Gemini | Gemini 2.5 Pro, Gemini 2.0 Flash, gemini-2.5-flash, gemini-2.5-flash-lite |
| Google Vertex AI | Gemini 2.5 Pro, Gemini 2.0 Flash, gemini-2.5-flash, gemini-2.5-flash-lite |
| Anthropic Claude | claude-opus-4, claude-sonnet-4.5, claude-haiku-4.5 |
| Custom | Any custom LLM model not listed above |

Configuration

Follow these steps to configure an LLM provider in Kognitos:

1. Open BYOM Settings

   Click the user icon 👤 in the top right corner and select BYOM from the dropdown menu.

2. Choose a Provider

   Under Add an LLM provider, click Set Up next to the provider you wish to configure (e.g., OpenAI, Google Gemini, Anthropic Claude, or Custom).

3. Enter API Credentials

   Enter the required credentials for the selected provider (e.g., API key, service account JSON, project ID, region). Required fields vary by provider.

4. Save Configuration

   Click Save Changes to apply your configuration.

Usage

After configuring models through BYOM, automations will use your stored credentials when executing LLM procedures.

Standard Providers

For standard providers like OpenAI, Google Gemini, or Anthropic, reference the model directly in your automation. The example below uses the extract data procedure with an OpenAI model:

extract data from the document
    the dpi is 144
    the openai model is "gpt-4.1-mini"
    the first field is "po number"
    the first field's format is "string"
    the first field's rule is "the po number has 10 characters"
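A configured Gemini model can be referenced the same way. Note that the `gemini model` parameter name below is assumed by analogy with the OpenAI example above and may differ in your environment:

extract data from the document
    the dpi is 144
    the gemini model is "gemini-2.5-flash"
    the first field is "po number"
    the first field's format is "string"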

Custom Providers

For custom models, specify the provider ID, indented within the LLM procedure:

the model provider is "provider-id"

For example:

extract data from the document
    the model provider is "custom-custom-llm-1764100945699-r6jisd"
    the first field is "invoice number"
    the first field's format is "string"

Retrieve the provider ID from the BYOM interface after saving a custom LLM configuration.
