1. Generating GCG Suffixes Using Azure Machine Learning
This notebook shows how to generate GCG suffixes using Azure Machine Learning (AML), which consists of three main steps:
1. Connect to an Azure Machine Learning (AML) workspace.
2. Create an AML environment with the required Python dependencies.
3. Submit a training job to AML.
Connect to Azure Machine Learning Workspace
The workspace is the top-level resource for Azure Machine Learning (AML), providing a centralized place to work with all the artifacts you create when using AML. In this section, we connect to the workspace in which the job will be run.
To connect to a workspace, we need three identifier parameters: a subscription ID, a resource group, and a workspace name. We pass these details to the MLClient from azure.ai.ml to get a handle to the required AML workspace. We use the default Azure authentication for this tutorial.
import os
from pyrit.common import IN_MEMORY, initialize_pyrit
initialize_pyrit(memory_db_type=IN_MEMORY)
# Enter details of your AML workspace
subscription_id = os.environ.get("AZURE_SUBSCRIPTION_ID")
resource_group = os.environ.get("AZURE_RESOURCE_GROUP")
workspace = os.environ.get("AZURE_ML_WORKSPACE_NAME")
print(workspace)
romanlutz
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
# Get a handle to the workspace
ml_client = MLClient(DefaultAzureCredential(), subscription_id, resource_group, workspace)
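If you want to verify the connection before proceeding, you can fetch the workspace details; any authentication or naming problems will surface here rather than at job submission time. This check is optional and only a minimal sketch using the handle created above.
# Optional sanity check: retrieve the workspace to confirm the handle works
ws = ml_client.workspaces.get(workspace)
print(ws.name, ws.location)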
Create AML Environment
To install the dependencies needed to run GCG, we create an AML environment from a Dockerfile.
from pathlib import Path
from azure.ai.ml.entities import BuildContext, Environment, JobResourceConfiguration
from pyrit.common.path import HOME_PATH
# Configure the AML environment with path to Dockerfile and dependencies
env_docker_context = Environment(
    build=BuildContext(path=Path(HOME_PATH) / "pyrit" / "auxiliary_attacks" / "gcg" / "src"),
    name="pyrit",
    description="PyRIT environment created from a Docker context.",
)
# Create or update the AML environment
ml_client.environments.create_or_update(env_docker_context)
Uploading src (0.0 MBs): 100%|##########| 972/972 [00:00<00:00, 3544.26it/s]
Environment({'arm_type': 'environment_version', 'latest_version': None, 'image': None, 'intellectual_property': None, 'is_anonymous': False, 'auto_increment_version': False, 'auto_delete_setting': None, 'name': 'pyrit', 'description': 'PyRIT environment created from a Docker context.', 'tags': {}, 'properties': {'azureml.labels': 'latest'}, 'print_as_yaml': False, 'id': '/subscriptions/db1ba766-2ca3-42c6-a19a-0f0d43134a8c/resourceGroups/romanlutz/providers/Microsoft.MachineLearningServices/workspaces/romanlutz/environments/pyrit/versions/1', 'Resource__source_path': '', 'base_path': 'c:\\Users\\Roman\\git\\PyRIT\\doc\\code\\auxiliary_attacks', 'creation_context': <azure.ai.ml.entities._system_data.SystemData object at 0x000002404A431160>, 'serialize': <msrest.serialization.Serializer object at 0x000002404A33A9E0>, 'version': '1', 'conda_file': None, 'build': <azure.ai.ml.entities._assets.environment.BuildContext object at 0x000002404A2AB4D0>, 'inference_config': None, 'os_type': 'Linux', 'conda_file_path': None, 'path': None, 'datastore': None, 'upload_hash': None, 'translated_conda_file': None})
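As an optional follow-up (a minimal sketch, not part of the original walkthrough), you can retrieve the registered environment by its latest label to confirm the name and version that the training job will reference.
# Optional: confirm the registration; AML labels the newest version as "latest" (see azureml.labels in the output above)
registered_env = ml_client.environments.get(name=env_docker_context.name, label="latest")
print(registered_env.name, registered_env.version)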
Submit Training Job to AML
Finally, we configure the command to run the GCG algorithm. The entry file for the algorithm is run.py, which takes several command line arguments, as shown below. We also have to specify the compute instance_type to run the algorithm on. In our experience, a GPU instance with at least 32 GB of VRAM is required. In the example below, we use Standard_NC96ads_A100_v4.
Depending on the compute instance you use, you may encounter “out of memory” errors. In this case, we recommend training on a smaller model or lowering n_train_data or batch_size, as sketched below.
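For illustration only, a reduced-memory configuration might use fewer training prompts and a smaller batch size; the specific values below are assumptions to adapt to your GPU, not settings from the original run.
# Hypothetical lower-memory inputs (illustrative values only)
low_memory_inputs = {
    "model_name": "phi_3_mini",
    "setup": "multiple",
    "n_train_data": 10,  # fewer training prompts than the run below
    "n_test_data": 0,
    "n_steps": 500,
    "batch_size": 64,  # smaller candidate batch per GCG step
}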
from azure.ai.ml import command
# Configure the command
job = command(
    code=Path(HOME_PATH),
    command="cd pyrit/auxiliary_attacks/gcg/experiments && python run.py --model_name ${{inputs.model_name}} --setup ${{inputs.setup}} --n_train_data ${{inputs.n_train_data}} --n_test_data ${{inputs.n_test_data}} --n_steps ${{inputs.n_steps}} --batch_size ${{inputs.batch_size}}",
    inputs={
        "model_name": "phi_3_mini",
        "setup": "multiple",
        "n_train_data": 25,
        "n_test_data": 0,
        "n_steps": 500,
        "batch_size": 256,
    },
    environment=f"{env_docker_context.name}:{env_docker_context.version}",
    environment_variables={"HUGGINGFACE_TOKEN": os.environ["HUGGINGFACE_TOKEN"]},
    display_name="suffix_generation",
    description="Generate a suffix for attacking LLMs.",
    resources=JobResourceConfiguration(
        instance_type="Standard_NC96ads_A100_v4",
        instance_count=1,
    ),
)
# Submit the command
returned_job = ml_client.create_or_update(job)
Your file exceeds 100 MB. If you experience low speeds, latency, or broken connections, we recommend using the AzCopyv10 tool for this file transfer.
Example: azcopy copy 'C:\Users\Roman\git\PyRIT' 'https://romanlutz0437468309.blob.core.windows.net/3f52e8b9-0bac-4c48-9e4a-a92e85a582c4-kywc2l2m01nbvhup37vc5bwxx9/PyRIT'
See https://learn.microsoft.com/azure/storage/common/storage-use-azcopy-v10 for more information.
Uploading PyRIT (395.92 MBs): 100%|##########| 395920235/395920235 [02:44<00:00, 2413840.02it/s]
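Once the job is submitted, you can monitor it from AML studio or stream its logs from the notebook. The snippet below is a minimal sketch using the returned_job handle from above; jobs.stream blocks until the run finishes.
# Optional: open the run in AML studio and tail its logs (blocks until completion)
print(returned_job.studio_url)
ml_client.jobs.stream(returned_job.name)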
# Close connection
from pyrit.memory import CentralMemory
memory = CentralMemory.get_memory_instance()
memory.dispose_engine()