# Oracle Cloud Infrastructure (OCI)
LiteLLM supports the following models through the OCI on-demand Generative AI API. Check the OCI Models List to confirm that a model is available in your region.
## Supported Models
### Meta Llama Models
- meta.llama-4-maverick-17b-128e-instruct-fp8
- meta.llama-4-scout-17b-16e-instruct
- meta.llama-3.3-70b-instruct
- meta.llama-3.2-90b-vision-instruct
- meta.llama-3.1-405b-instruct
### xAI Grok Models
- xai.grok-4
- xai.grok-3
- xai.grok-3-fast
- xai.grok-3-mini
- xai.grok-3-mini-fast
### Cohere Models
- cohere.command-latest
- cohere.command-a-03-2025
- cohere.command-plus-latest
## Authentication
LiteLLM supports two authentication methods for OCI:
### Method 1: Manual Credentials
Provide individual OCI credentials directly to LiteLLM. Follow the official Oracle tutorial to create a signing key and obtain the following parameters:

- `user`
- `fingerprint`
- `tenancy`
- `region`
- `key_file` or `key`
- `compartment_id`
This is the default method for LiteLLM AI Gateway (LLM Proxy) access to OCI GenAI models.
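If you keep these credentials in a standard `~/.oci/config` file, its fields map directly onto LiteLLM's `oci_*` parameters. A minimal stdlib-only sketch of that mapping (the helper name and default path/profile here are illustrative assumptions, not part of LiteLLM's API):

```python
import configparser
from pathlib import Path

def load_oci_params(path="~/.oci/config", profile="DEFAULT"):
    """Map a standard OCI config profile onto LiteLLM's oci_* parameter names."""
    cfg = configparser.ConfigParser()
    cfg.read(Path(path).expanduser())
    section = cfg[profile]
    return {
        "oci_user": section["user"],
        "oci_fingerprint": section["fingerprint"],
        "oci_tenancy": section["tenancy"],
        "oci_region": section["region"],
        "oci_key_file": section["key_file"],
    }
```

The resulting dict can then be splatted into the call, e.g. `completion(model="oci/xai.grok-4", messages=messages, oci_compartment_id="<oci_compartment_id>", **load_oci_params())`.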
### Method 2: OCI SDK Signer
Use an OCI SDK Signer object for authentication. This method:

- Leverages the official OCI SDK for signing
- Supports additional authentication methods (instance principals, workload identity, etc.)
To use this method, install the OCI SDK:
```shell
pip install oci
```
This method is the natural choice when running the LiteLLM SDK on Oracle Cloud Infrastructure (compute instances or Oracle Kubernetes Engine).
## Usage
Input the parameters obtained from the OCI signing key creation process into the completion function:
```python
from litellm import completion

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    oci_region="<your_oci_region>",
    oci_user="<your_oci_user>",
    oci_fingerprint="<your_oci_fingerprint>",
    oci_tenancy="<your_oci_tenancy>",
    oci_serving_mode="ON_DEMAND",  # Optional, default is "ON_DEMAND". Other option is "DEDICATED"
    # Provide either the private key string OR the path to the key file:
    # Option 1: pass the private key as a string
    oci_key="<string_with_content_of_oci_key>",
    # Option 2: pass the private key file path
    # oci_key_file="<path/to/oci_key.pem>",
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```
Use the OCI SDK Signer for authentication:
```python
from litellm import completion
from oci.signer import Signer

# Create an OCI Signer
signer = Signer(
    tenancy="ocid1.tenancy.oc1..",
    user="ocid1.user.oc1..",
    fingerprint="xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx",
    private_key_file_location="~/.oci/key.pem",
    # Or use private_key_content="<your_private_key_content>"
)

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    oci_signer=signer,
    oci_region="us-chicago-1",  # Optional, defaults to us-ashburn-1
    oci_serving_mode="ON_DEMAND",  # Optional, default is "ON_DEMAND". Other option is "DEDICATED"
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```
#### Alternative: Use the OCI Config File

The OCI SDK can automatically load credentials from `~/.oci/config`:
```python
from litellm import completion
from oci.config import from_file
from oci.signer import Signer

# Load config from file ("DEFAULT" is the profile name)
config = from_file("~/.oci/config", "DEFAULT")

signer = Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
    pass_phrase=config.get("pass_phrase"),  # Only needed if the key is encrypted
)

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    oci_signer=signer,
    oci_region=config["region"],
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```
#### Instance Principal Authentication

For applications running on OCI compute instances:
```python
from litellm import completion
from oci.auth.signers import InstancePrincipalsSecurityTokenSigner

# Use instance principal authentication
signer = InstancePrincipalsSecurityTokenSigner()

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    oci_signer=signer,
    oci_region="us-chicago-1",
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```
#### Workload Identity Authentication

For applications running in Oracle Kubernetes Engine (OKE):
```python
from litellm import completion
from oci.auth.signers import get_oke_workload_identity_resource_principal_signer

# Use OKE workload identity authentication
signer = get_oke_workload_identity_resource_principal_signer()

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    oci_signer=signer,
    oci_region="us-chicago-1",
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```
## Usage - Streaming
Just set `stream=True` when calling `completion`.
With manual credentials:

```python
from litellm import completion

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    stream=True,
    oci_region="<your_oci_region>",
    oci_user="<your_oci_user>",
    oci_fingerprint="<your_oci_fingerprint>",
    oci_tenancy="<your_oci_tenancy>",
    oci_serving_mode="ON_DEMAND",  # Optional, default is "ON_DEMAND". Other option is "DEDICATED"
    # Provide either the private key string OR the path to the key file:
    # Option 1: pass the private key as a string
    oci_key="<string_with_content_of_oci_key>",
    # Option 2: pass the private key file path
    # oci_key_file="<path/to/oci_key.pem>",
    oci_compartment_id="<oci_compartment_id>",
)

for chunk in response:
    print(chunk["choices"][0]["delta"]["content"])  # same as openai format
```
With the OCI SDK Signer:

```python
from litellm import completion
from oci.signer import Signer

signer = Signer(
    tenancy="ocid1.tenancy.oc1..",
    user="ocid1.user.oc1..",
    fingerprint="xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx",
    private_key_file_location="~/.oci/key.pem",
)

messages = [{"role": "user", "content": "Hey! how's it going?"}]

response = completion(
    model="oci/xai.grok-4",
    messages=messages,
    stream=True,
    oci_signer=signer,
    oci_region="us-chicago-1",
    oci_compartment_id="<oci_compartment_id>",
)

for chunk in response:
    print(chunk["choices"][0]["delta"]["content"])  # same as openai format
```
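The streamed chunks follow the OpenAI delta format, so assembling the full reply is a matter of concatenating each chunk's `delta` content. A stdlib-only sketch with simulated chunks (the chunk payloads below are illustrative stand-ins, not real API output):

```python
# Simulated chunks in the OpenAI streaming format that LiteLLM returns.
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", "}}]},
    {"choices": [{"delta": {"content": "world!"}}]},
    {"choices": [{"delta": {}}]},  # a final chunk may carry no content
]

def collect_stream(stream):
    """Concatenate the delta content of each chunk into the full response text."""
    parts = []
    for chunk in stream:
        content = chunk["choices"][0]["delta"].get("content")
        if content is not None:
            parts.append(content)
    return "".join(parts)

print(collect_stream(chunks))  # -> Hello, world!
```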
## Usage Examples by Model Type
### Using Cohere Models
With the OCI SDK Signer:

```python
from litellm import completion
from oci.signer import Signer

signer = Signer(
    tenancy="ocid1.tenancy.oc1..",
    user="ocid1.user.oc1..",
    fingerprint="xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx",
    private_key_file_location="~/.oci/key.pem",
)

messages = [{"role": "user", "content": "Explain quantum computing"}]

response = completion(
    model="oci/cohere.command-latest",
    messages=messages,
    oci_signer=signer,
    oci_region="us-chicago-1",
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```
With manual credentials:

```python
from litellm import completion

messages = [{"role": "user", "content": "Explain quantum computing"}]

response = completion(
    model="oci/cohere.command-latest",
    messages=messages,
    oci_region="us-chicago-1",
    oci_user="<your_oci_user>",
    oci_fingerprint="<your_oci_fingerprint>",
    oci_tenancy="<your_oci_tenancy>",
    oci_key="<string_with_content_of_oci_key>",
    oci_compartment_id="<oci_compartment_id>",
)
print(response)
```