pyrit.prompt_target.OpenAITarget#

class OpenAITarget(*, model_name: str | None = None, endpoint: str | None = None, api_key: str | Callable[[], str | Awaitable[str]] | None = None, headers: str | None = None, max_requests_per_minute: int | None = None, httpx_client_kwargs: dict[str, Any] | None = None, underlying_model: str | None = None, capabilities: TargetCapabilities | None = None)[source]#

Bases: PromptTarget

Abstract base class for OpenAI-based prompt targets.

This class provides common functionality for interacting with OpenAI API endpoints, handling authentication, rate limiting, and request/response processing.

Read more about the various models here: https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models.

__init__(*, model_name: str | None = None, endpoint: str | None = None, api_key: str | Callable[[], str | Awaitable[str]] | None = None, headers: str | None = None, max_requests_per_minute: int | None = None, httpx_client_kwargs: dict[str, Any] | None = None, underlying_model: str | None = None, capabilities: TargetCapabilities | None = None) None[source]#

Initialize an instance of OpenAITarget.

Parameters:
  • model_name (str, Optional) – The name of the model (or the deployment name in Azure). If no value is provided, it is read from a target-specific environment variable set by the subclass.

  • endpoint (str, Optional) – The target URL for the OpenAI service.

  • api_key (str | Callable[[], str | Awaitable[str]], Optional) – The API key for accessing the OpenAI service, or a callable that returns an access token (sync or async). For Azure endpoints, if no API key is provided (via parameter or environment variable), Entra ID authentication is used automatically. You can also explicitly pass a token provider from pyrit.auth (e.g., get_azure_openai_auth(endpoint) for async, or get_azure_token_provider(scope) for sync). Synchronous token providers are automatically wrapped to work with async clients. Defaults to the target-specific API key environment variable.

  • headers (str, Optional) – Extra headers to include in requests to the endpoint, given as a JSON string.

  • max_requests_per_minute (int, Optional) – Number of requests the target can handle per minute before hitting a rate limit. The number of requests sent to the target will be capped at the value provided.

  • httpx_client_kwargs (dict, Optional) – Additional kwargs to be passed to the httpx.AsyncClient() constructor.

  • underlying_model (str, Optional) – The underlying model name (e.g., “gpt-4o”), used solely for target identification. This is useful when the deployment name in Azure differs from the actual model name. If not provided, the value is read from an environment variable; if that is not set either, the identifier falls back to model_name. Defaults to None.

  • capabilities (TargetCapabilities, Optional) – Override the default capabilities for this target instance. If None, uses the class-level defaults. Defaults to None.

Raises:

ValueError – If no API key is provided and the endpoint is not an Azure endpoint.
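As noted for the api_key parameter, synchronous token providers are automatically wrapped to work with async clients. A minimal sketch of how such wrapping can work, using only the standard library (the actual PyRIT wrapping logic may differ; the function and provider names here are illustrative, not part of the PyRIT API):

```python
import asyncio
import inspect

def wrap_token_provider(provider):
    """Normalize a sync or async token provider into an async callable (sketch)."""
    async def get_token() -> str:
        result = provider()
        # If the provider is async (returns an awaitable), await it;
        # otherwise the sync result is returned as-is.
        if inspect.isawaitable(result):
            result = await result
        return result
    return get_token

def sync_provider() -> str:
    return "token-from-sync"

async def async_provider() -> str:
    return "token-from-async"

async def main():
    print(await wrap_token_provider(sync_provider)())   # token-from-sync
    print(await wrap_token_provider(async_provider)())  # token-from-async

asyncio.run(main())
```

Either style can then be passed as api_key; PyRIT also exposes ready-made providers in pyrit.auth (e.g., get_azure_openai_auth for async, get_azure_token_provider for sync).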

Methods

__init__(*[, model_name, endpoint, api_key, ...])

Initialize an instance of OpenAITarget.

dispose_db_engine()

Dispose database engine to release database connections and resources.

get_identifier()

Get the component's identifier, building it lazily on first access.

is_json_response_supported()

Abstract method to determine if JSON response format is supported by the target.

send_prompt_async(*, message)

Send a normalized prompt async to the prompt target.

set_model_name(*, model_name)

Set the model name for this target.

Attributes

ADDITIONAL_REQUEST_HEADERS

capabilities

The capabilities of this target instance.

supports_multi_turn

Whether this target supports multi-turn conversations.

model_name_environment_variable

endpoint_environment_variable

api_key_environment_variable

underlying_model_environment_variable

supported_converters

A list of PromptConverters that are supported by the prompt target.

ADDITIONAL_REQUEST_HEADERS: str = 'OPENAI_ADDITIONAL_REQUEST_HEADERS'#
api_key_environment_variable: str#
endpoint_environment_variable: str#
abstract is_json_response_supported() bool[source]#

Abstract method to determine if JSON response format is supported by the target.

Returns:

True if JSON response is supported, False otherwise.

Return type:

bool
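Concrete subclasses must override this method to declare whether their endpoint accepts a JSON response format. A minimal sketch of the pattern, using an illustrative stand-in for the abstract base (the real OpenAITarget in pyrit.prompt_target carries additional machinery; the class names below are hypothetical):

```python
from abc import ABC, abstractmethod

class OpenAITargetSketch(ABC):
    """Illustrative stand-in for the abstract base class."""

    @abstractmethod
    def is_json_response_supported(self) -> bool:
        """Return True if the target supports a JSON response format."""

class ChatTargetSketch(OpenAITargetSketch):
    def is_json_response_supported(self) -> bool:
        # Chat-completion-style endpoints typically support JSON mode.
        return True

class ImageTargetSketch(OpenAITargetSketch):
    def is_json_response_supported(self) -> bool:
        # Image-generation endpoints return image payloads, not JSON text.
        return False
```

Callers can then check the flag before requesting a structured JSON response from the target.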

model_name_environment_variable: str#
underlying_model_environment_variable: str#