gen_ai_hub.proxy.gen_ai_hub_proxy package
- class GenAIHubProxyClient
Bases:
BaseProxyClient

GenAIHubProxyClient is a proxy client for interacting with the GenAI Hub.
- classmethod add_foundation_model_scenario(scenario_id, config_names=None, prediction_url_suffix=None, model_name_parameter='model_name')
Add a foundational model scenario to the client.
- Parameters:
scenario_id (str) -- the scenario ID.
config_names (Optional[List[str]], optional) -- list of configuration names, defaults to None
prediction_url_suffix (Optional[str], optional) -- prediction URL suffix, defaults to None
model_name_parameter (str, optional) -- model name parameter, defaults to 'model_name'
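Registering a scenario extends the class-level foundational_model_scenarios list shown further below. The classmethod-registry pattern can be sketched as follows; the Scenario dataclass and ProxyClientSketch class here are simplified stand-ins, not the SDK's actual types:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Scenario:
    # Simplified stand-in for FoundationalModelScenario.
    scenario_id: str
    config_names: Optional[List[str]] = None
    prediction_url_suffix: Optional[str] = None
    model_name_parameter: str = "model_name"


class ProxyClientSketch:
    # Class-level registry, mirroring the documented ClassVar default.
    scenarios: List[Scenario] = [Scenario("foundation-models", ["*"])]

    @classmethod
    def add_foundation_model_scenario(cls, scenario_id, config_names=None,
                                      prediction_url_suffix=None,
                                      model_name_parameter="model_name"):
        # Append a new scenario so later deployment discovery can match it.
        cls.scenarios.append(Scenario(scenario_id, config_names,
                                      prediction_url_suffix, model_name_parameter))


ProxyClientSketch.add_foundation_model_scenario("my-custom-scenario",
                                                config_names=["my-config"])
print([s.scenario_id for s in ProxyClientSketch.scenarios])
# ['foundation-models', 'my-custom-scenario']
```

Because the registry is a classmethod operating on a ClassVar, registrations apply to all client instances, which matches the documented class-level default containing the built-in 'foundation-models' scenario.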
- classmethod for_profile(profile=None)
Create a GenAIHubProxyClient instance for the given profile.
- Parameters:
profile (str, optional) -- Profile name, defaults to None
- Returns:
GenAIHubProxyClient instance.
- Return type:
GenAIHubProxyClient
- classmethod init_client(data)
Initialize the client with the provided data.
- Parameters:
data (Any) -- Input data for client initialization.
- Returns:
Initialized data.
- Return type:
Any
- classmethod set_default_values(**kwargs)
Set default values for the client.
- get_additional_headers()
Get only the additional headers (instance-level and request-level).
- Returns:
Additional headers.
- Return type:
Dict[str, str]
- get_ai_core_token()
Get the AI core token for authentication.
- Returns:
AI core token.
- Return type:
str
- get_deployments()
Get the list of deployments.
- Returns:
List of deployments.
- Return type:
List[Deployment]
- get_request_header()
Get the request headers for requests made by the client.
- Returns:
Request headers.
- Return type:
Dict[str, str]
- model_post_init(context, /)
This method behaves like a BaseModel hook that initializes private attributes.
It takes context as an argument because that is what pydantic-core passes when calling it.
- Parameters:
context (Any) -- The context passed by pydantic-core.
- Return type:
None
- select_deployment(raise_on_multiple=False, **search_key_value)
Select a deployment matching the given search key-value pairs.
- Parameters:
raise_on_multiple (bool) -- whether to raise an error if more than one deployment matches.
search_key_value -- key-value pairs used to filter deployments.
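The selection semantics can be sketched as filtering the cached deployment list by attribute equality, with raise_on_multiple controlling whether an ambiguous match is an error. This is an illustrative stand-in, not the SDK's actual matching logic:

```python
from dataclasses import dataclass


@dataclass
class DeploymentRecord:
    # Simplified stand-in for the Deployment model.
    deployment_id: str
    model_name: str
    config_name: str


def select_deployment(deployments, raise_on_multiple=False, **search_key_value):
    # Keep only deployments whose attributes equal every search key-value pair.
    matches = [d for d in deployments
               if all(getattr(d, k, None) == v for k, v in search_key_value.items())]
    if not matches:
        raise ValueError(f"no deployment matches {search_key_value}")
    if raise_on_multiple and len(matches) > 1:
        raise ValueError(f"multiple deployments match {search_key_value}")
    return matches[0]


deployments = [
    DeploymentRecord("d1", "gpt-4o", "conf-a"),
    DeploymentRecord("d2", "gemini-1.5-pro", "conf-b"),
]
print(select_deployment(deployments, model_name="gpt-4o").deployment_id)  # d1
```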
- set_headers_addition(headers)
Set additional headers for requests made by the client.
- Parameters:
headers (Dict[str, str]) -- Headers to add.
- update_deployments()
Update the list of deployments from the GenAI Hub.
- Returns:
List of updated deployments.
- Return type:
List[Deployment]
- AI_CLIENT_TYPE_VAL: ClassVar[str] = 'GenAI Hub SDK (Python)'
- ai_core_client: AICoreV2Client | None
- auth_url: str | None
- base_url: str | None
- client_id: str | None
- client_secret: str | None
- default_values: ClassVar[Dict[str, Any]] = {}
- property deployment_class: Type[Deployment]
- property deployments: List[Deployment]
- foundational_model_scenarios: ClassVar[List[FoundationalModelScenario]] = [FoundationalModelScenario(scenario_id='foundation-models', config_names=['*'], model_name_parameter='model_name', prediction_url_suffix=None)]
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'allow', 'protected_namespaces': ()}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- on_invalid_deployments: ClassVar[InvalidDeploymentBehavior] = 'warn'
- property request_header: Dict[str, Any]
- resource_group: str | None
- temporary_headers_addition(headers)
Context manager to temporarily add headers to requests made by the GenAIHubProxyClient.
- Parameters:
headers (Dict[str, str]) -- Headers to add temporarily.
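The layering of instance-level headers (set_headers_addition) and request-level headers (temporary_headers_addition) can be sketched with a plain dict and contextlib. The class and attribute names below are illustrative, not the SDK's internals:

```python
from contextlib import contextmanager


class HeaderLayering:
    def __init__(self):
        self._instance_headers = {}   # set via set_headers_addition
        self._request_headers = {}    # set via temporary_headers_addition

    def set_headers_addition(self, headers):
        self._instance_headers = dict(headers)

    @contextmanager
    def temporary_headers_addition(self, headers):
        # Request-level headers apply only inside the with-block.
        previous = self._request_headers
        self._request_headers = {**previous, **headers}
        try:
            yield self
        finally:
            self._request_headers = previous

    def get_additional_headers(self):
        # Request-level headers override instance-level ones on key clashes.
        return {**self._instance_headers, **self._request_headers}


client = HeaderLayering()
client.set_headers_addition({"x-tenant": "acme"})
with client.temporary_headers_addition({"x-trace-id": "abc123"}):
    print(client.get_additional_headers())
    # {'x-tenant': 'acme', 'x-trace-id': 'abc123'}
print(client.get_additional_headers())  # {'x-tenant': 'acme'}
```

The try/finally in the context manager guarantees the request-level headers are restored even if the request inside the with-block raises.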
Submodules
gen_ai_hub.proxy.gen_ai_hub_proxy.client module
- class Deployment
Bases:
BaseDeployment

Deployment class represents a deployment of a foundational model in the GenAI Hub.
- classmethod get_model_identification_kwargs()
Get model identification keywords.
- Returns:
Tuple of model identification keywords.
- Return type:
Tuple[str]
- additional_request_body_kwargs()
- Return type:
Dict[str, Any]
- additonal_parameters: Dict[str, str]
- config_id: str
- config_name: str
- created_at: datetime
- custom_prediction_suffix: str | None
- deployment_id: str
- model_config: ClassVar[ConfigDict] = {'protected_namespaces': ()}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_name: str
- property prediction_url
- prediction_urls: ClassVar[PredictionURLs] = <gen_ai_hub.proxy.core.utils.PredictionURLs object>
- Parameters:
model_name (str)
url (str)
fixed_suffix (Optional[str])
- Return type:
str
- url: str
- class FoundationalModelScenario
Bases:
BaseModel

Represents a foundational model scenario in the GenAI Hub.
- classmethod adjust(data)
Adjust input data before model initialization.
- Parameters:
data (Any) -- Input data to adjust.
- Returns:
Adjusted data.
- Return type:
Any
- config_names: List[str] | str | None
- model_config: ClassVar[ConfigDict] = {'protected_namespaces': ()}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- model_name_parameter: str
- prediction_url_suffix: str | None
- scenario_id: str
- class GenAIHubRestClient
Bases:
object

REST client with automatic header injection.
This client wraps the AI Core rest_client and ensures that all requests include:
- Instance-level headers (set via proxy_client.set_headers_addition)
- Request-level headers (set via the temporary_headers_addition context manager)
- Parameters:
proxy_client -- The GenAIHubProxyClient instance to get the rest_client and headers from.
- __init__(proxy_client)
Initialize the GenAIHubRestClient.
- Parameters:
proxy_client (GenAIHubProxyClient) -- The GenAIHubProxyClient instance to get the rest_client and headers from.
- delete(path, **kwargs)
Send a DELETE request with injected headers.
- Parameters:
path (str) -- The API path.
kwargs -- Additional arguments to pass to the underlying rest_client.
- Returns:
The response from the rest_client.
- get(path, **kwargs)
Send a GET request with injected headers.
- Parameters:
path (str) -- The API path.
kwargs -- Additional arguments to pass to the underlying rest_client.
- Returns:
The response from the rest_client.
- patch(path, **kwargs)
Send a PATCH request with injected headers.
- Parameters:
path (str) -- The API path.
kwargs -- Additional arguments to pass to the underlying rest_client.
- Returns:
The response from the rest_client.
- post(path, **kwargs)
Send a POST request with injected headers.
- Parameters:
path (str) -- The API path.
kwargs -- Additional arguments to pass to the underlying rest_client.
- Returns:
The response from the rest_client.
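The wrapping behavior can be sketched as merging the proxy client's additional headers into each request's headers kwarg before delegating. The DummyRestClient below is a stand-in used for illustration, not AI Core's real rest_client API:

```python
class DummyRestClient:
    # Stand-in for the wrapped rest_client; echoes what it was called with.
    def get(self, path, headers=None, **kwargs):
        return {"path": path, "headers": headers or {}}


class RestClientSketch:
    def __init__(self, rest_client, additional_headers):
        self._rest_client = rest_client
        self._additional_headers = additional_headers

    def _inject(self, kwargs):
        # Caller-supplied headers win over injected ones on key clashes.
        merged = {**self._additional_headers, **kwargs.get("headers", {})}
        return {**kwargs, "headers": merged}

    def get(self, path, **kwargs):
        return self._rest_client.get(path, **self._inject(kwargs))


client = RestClientSketch(DummyRestClient(), {"x-tenant": "acme"})
print(client.get("/v2/lm/deployments")["headers"])  # {'x-tenant': 'acme'}
```

The same _inject step would apply identically to post, patch, and delete, which is why all four documented methods share the "with injected headers" wording.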
- class InvalidDeploymentBehavior
Bases:
str, Enum

- __new__(value)
- ignore = 'ignore'
- raise_error = 'raise_error'
- warn = 'warn'
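Because the enum subclasses str, its members compare equal to their plain string values, which is why on_invalid_deployments can be documented with the bare default 'warn'. A minimal equivalent:

```python
from enum import Enum


class InvalidDeploymentBehavior(str, Enum):
    ignore = "ignore"
    raise_error = "raise_error"
    warn = "warn"


# The str mixin makes members interchangeable with plain strings.
print(InvalidDeploymentBehavior.warn == "warn")  # True
print(InvalidDeploymentBehavior("ignore").name)  # ignore
```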
- camel_to_snake(name)
Convert camelCase or PascalCase string to snake_case.
- Parameters:
name (str) -- Input string in camelCase or PascalCase.
- Returns:
String converted to snake_case.
- Return type:
str
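A common regex-based implementation of this conversion looks like the following; this is a sketch of the documented behavior, not necessarily the SDK's exact code:

```python
import re


def camel_to_snake(name: str) -> str:
    # Insert an underscore before a capital that starts a new lowercase run,
    # then before any capital that directly follows a lowercase letter or digit.
    s = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)
    return re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s).lower()


print(camel_to_snake("modelName"))     # model_name
print(camel_to_snake("AICoreClient"))  # ai_core_client
```

The two-pass approach keeps leading acronyms such as "AI" grouped as a single word instead of splitting them letter by letter.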
- config_parameters(model_name_parameter, ai_core_client, deployment)
Get configuration parameters for a deployment.
- Parameters:
model_name_parameter (str) -- the model name parameter.
ai_core_client (AICoreV2Client) -- the AI core client.
deployment (Deployment) -- the deployment.
- Returns:
Dictionary with model name and additional parameters.
- Return type:
Dict[str, Any]
- temporary_headers_addition(headers)
Context manager to temporarily add headers to requests made by the GenAIHubProxyClient.
- Parameters:
headers (Dict[str, str]) -- Headers to add temporarily.