APPLIES TO: Azure CLI ml extension v2 (current), Python SDK azure-ai-ml v2 (current)
In this article, you learn how to connect to Azure data storage services with Azure Machine Learning datastores.
Prerequisites
Note
Machine Learning datastores don't create the underlying storage account resources. Instead, they link an existing storage account for Machine Learning use. Machine Learning datastores aren't required: if you have access to the underlying data, you can use storage URIs directly.
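For example, a storage URI can be passed straight to a job input without registering a datastore. The following is a minimal sketch, assuming your identity can read the (hypothetical) blob URI shown; the environment and compute names are placeholders to replace with resources in your workspace:
from azure.ai.ml import MLClient, command, Input
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
# A hypothetical blob URI used directly, without a datastore.
job = command(
    command="head ${{inputs.raw_data}}",
    inputs={
        "raw_data": Input(
            type=AssetTypes.URI_FILE,
            path="https://<storage_account>.blob.core.windows.net/<container>/<path>/data.csv",
        )
    },
    environment="<environment_name>@latest",  # replace with an environment in your workspace
    compute="<compute_cluster_name>",         # replace with your compute target
)
ml_client.jobs.create_or_update(job)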
 
Create an Azure Blob datastore
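The following example creates a credential-less Azure Blob datastore: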
from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureBlobDatastore(
    name="",
    description="",
    account_name="",
    container_name=""
)
ml_client.create_or_update(store)
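The following example creates an Azure Blob datastore that authenticates with an account key: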
from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml.entities import AccountKeyConfiguration
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureBlobDatastore(
    name="blob_protocol_example",
    description="Datastore pointing to a blob container using https protocol.",
    account_name="mytestblobstore",
    container_name="data-container",
    protocol="https",
    credentials=AccountKeyConfiguration(
        account_key="aaaaaaaa-0b0b-1c1c-2d2d-333333333333"
    ),
)
ml_client.create_or_update(store)
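The following example creates an Azure Blob datastore that authenticates with a SAS token: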
from azure.ai.ml.entities import AzureBlobDatastore
from azure.ai.ml.entities import SasTokenConfiguration
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureBlobDatastore(
    name="blob_sas_example",
    description="Datastore pointing to a blob container using SAS token.",
    account_name="mytestblobstore",
    container_name="data-container",
    credentials=SasTokenConfiguration(
        sas_token= "?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
    ),
)
ml_client.create_or_update(store)
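After a datastore is registered, data on it can be referenced with an azureml://datastores/<datastore_name>/paths/<path> URI, for example when you create a data asset. The following is a minimal sketch that points at the SAS token datastore created above; the data asset name and folder path are hypothetical placeholders:
from azure.ai.ml import MLClient
from azure.ai.ml.constants import AssetTypes
from azure.ai.ml.entities import Data
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
# <path_on_datastore> is a placeholder for a folder inside the container.
blob_folder = Data(
    name="blob-example-folder",
    description="Folder on the blob_sas_example datastore.",
    type=AssetTypes.URI_FOLDER,
    path="azureml://datastores/blob_sas_example/paths/<path_on_datastore>/",
)
ml_client.data.create_or_update(blob_folder)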
Create the following YAML file (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: my_blob_ds # add your datastore name here
type: azure_blob
description: here is a description # add a datastore description here
account_name: my_account_name # add the storage account name here
container_name: my_container_name # add the storage container name here
Create the Machine Learning datastore in the Azure CLI:
az ml datastore create --file my_blob_datastore.yml
Create this YAML file (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: blob_example
type: azure_blob
description: Datastore pointing to a blob container.
account_name: mytestblobstore
container_name: data-container
credentials:
  account_key: aaaaaaaa-0b0b-1c1c-2d2d-333333333333
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_blob_datastore.yml
Create this YAML file (update the appropriate values):
# my_blob_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
name: blob_sas_example
type: azure_blob
description: Datastore pointing to a blob container using SAS token.
account_name: mytestblobstore
container_name: data-container
credentials:
  sas_token: "?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_blob_datastore.yml
 
Create an Azure Data Lake Storage Gen2 datastore
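The following example creates a credential-less Azure Data Lake Storage Gen2 datastore: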
from azure.ai.ml.entities import AzureDataLakeGen2Datastore
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureDataLakeGen2Datastore(
    name="",
    description="",
    account_name="",
    filesystem=""
)
ml_client.create_or_update(store)
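The following example creates an Azure Data Lake Storage Gen2 datastore that authenticates with a service principal: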
from azure.ai.ml.entities import AzureDataLakeGen2Datastore
from azure.ai.ml.entities._datastore.credentials import ServicePrincipalCredentials
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureDataLakeGen2Datastore(
    name="adls_gen2_example",
    description="Datastore pointing to an Azure Data Lake Storage Gen2.",
    account_name="mytestdatalakegen2",
    filesystem="my-gen2-container",
    credentials=ServicePrincipalCredentials(
        tenant_id= "bbbbcccc-1111-dddd-2222-eeee3333ffff",
        client_id= "44445555-eeee-6666-ffff-7777aaaa8888",
        client_secret= "Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4",
    ),
)
ml_client.create_or_update(store)
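To confirm that a datastore was registered, you can retrieve it from the workspace by name. The following is a minimal sketch that looks up the service principal datastore created above:
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
# Retrieve the registered datastore by name and inspect a few of its properties.
datastore = ml_client.datastores.get("adls_gen2_example")
print(datastore.name, datastore.type, datastore.account_name)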
Create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen2.schema.json
name: adls_gen2_credless_example
type: azure_data_lake_gen2
description: Credential-less datastore pointing to an Azure Data Lake Storage Gen2 instance.
account_name: mytestdatalakegen2
filesystem: my-gen2-container
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
Create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen2.schema.json
name: adls_gen2_example
type: azure_data_lake_gen2
description: Datastore pointing to an Azure Data Lake Storage Gen2 instance.
account_name: mytestdatalakegen2
filesystem: my-gen2-container
credentials:
  tenant_id: bbbbcccc-1111-dddd-2222-eeee3333ffff
  client_id: 44445555-eeee-6666-ffff-7777aaaa8888
  client_secret: Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
 
Create an Azure Files datastore
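The following example creates an Azure Files datastore that authenticates with an account key: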
from azure.ai.ml.entities import AzureFileDatastore
from azure.ai.ml.entities import AccountKeyConfiguration
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureFileDatastore(
    name="file_example",
    description="Datastore pointing to an Azure File Share.",
    account_name="mytestfilestore",
    file_share_name="my-share",
    credentials=AccountKeyConfiguration(
        account_key= "aaaaaaaa-0b0b-1c1c-2d2d-333333333333"
    ),
)
ml_client.create_or_update(store)
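The following example creates an Azure Files datastore that authenticates with a SAS token: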
from azure.ai.ml.entities import AzureFileDatastore
from azure.ai.ml.entities import SasTokenConfiguration
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureFileDatastore(
    name="file_sas_example",
    description="Datastore pointing to an Azure File Share using SAS token.",
    account_name="mytestfilestore",
    file_share_name="my-share",
    credentials=SasTokenConfiguration(
        sas_token="?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
    ),
)
ml_client.create_or_update(store)
Create this YAML file (update the values):
# my_files_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureFile.schema.json
name: file_example
type: azure_file
description: Datastore pointing to an Azure File Share.
account_name: mytestfilestore
file_share_name: my-share
credentials:
  account_key: aaaaaaaa-0b0b-1c1c-2d2d-333333333333
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_files_datastore.yml
Create this YAML file (update the values):
# my_files_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureFile.schema.json
name: file_sas_example
type: azure_file
description: Datastore pointing to an Azure File Share using an SAS token.
account_name: mytestfilestore
file_share_name: my-share
credentials:
  sas_token: "?xx=A1bC2dE3fH4iJ5kL6mN7oP8qR9sT0u&xx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1wx&xx=Ff6Gg~7Hh8.-Ii9Jj0Kk1Ll2Mm3Nn4_Oo5Pp6Qq7&xx=N7oP8qR9sT0uV1wX2yZ3aB4cD5eF6g&xxx=Ee5Ff~6Gg7.-Hh8Ii9Jj0Kk1Ll2Mm3_Nn4Oo5Pp6&xxx=C2dE3fH4iJ5kL6mN7oP8qR9sT0uV1w"
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_files_datastore.yml
 
Create an Azure Data Lake Storage Gen1 datastore
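The following example creates a credential-less Azure Data Lake Storage Gen1 datastore: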
from azure.ai.ml.entities import AzureDataLakeGen1Datastore
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureDataLakeGen1Datastore(
    name="",
    store_name="",
    description="",
)
ml_client.create_or_update(store)
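The following example creates an Azure Data Lake Storage Gen1 datastore that authenticates with a service principal: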
from azure.ai.ml.entities import AzureDataLakeGen1Datastore
from azure.ai.ml.entities._datastore.credentials import ServicePrincipalCredentials
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = AzureDataLakeGen1Datastore(
    name="adls_gen1_example",
    description="Datastore pointing to an Azure Data Lake Storage Gen1.",
    store_name="mytestdatalakegen1",
    credentials=ServicePrincipalCredentials(
        tenant_id= "bbbbcccc-1111-dddd-2222-eeee3333ffff",
        client_id= "44445555-eeee-6666-ffff-7777aaaa8888",
        client_secret= "Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4",
    ),
)
ml_client.create_or_update(store)
Create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen1.schema.json
name: adls_gen1_credless_example
type: azure_data_lake_gen1
description: Credential-less datastore pointing to an Azure Data Lake Storage Gen1 instance.
store_name: mytestdatalakegen1
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
Create this YAML file (update the values):
# my_adls_datastore.yml
$schema: https://azuremlschemas.azureedge.net/latest/azureDataLakeGen1.schema.json
name: adls_gen1_example
type: azure_data_lake_gen1
description: Datastore pointing to an Azure Data Lake Storage Gen1 instance.
store_name: mytestdatalakegen1
credentials:
  tenant_id: bbbbcccc-1111-dddd-2222-eeee3333ffff
  client_id: 44445555-eeee-6666-ffff-7777aaaa8888
  client_secret: Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_adls_datastore.yml
 
Create a OneLake (Microsoft Fabric) datastore (preview)
This section describes the options for creating a OneLake datastore. The OneLake datastore is part of Microsoft Fabric. Currently, Machine Learning supports connection to Microsoft Fabric lakehouse artifacts in the "Files" folder, including folders or files and Amazon S3 shortcuts. For more information about lakehouses, see What is a lakehouse in Microsoft Fabric?.
Creation of a OneLake datastore requires the following information from your Microsoft Fabric instance:
- Endpoint
- Workspace GUID
- Artifact GUID
You can find these values on the Properties page of the lakehouse artifact in your Microsoft Fabric instance.
The "Endpoint", "Workspace GUID", and "Artifact GUID" appear in the "URL" and "ABFS path" values on the Properties page:
- URL format: https://{your_one_lake_endpoint}/{your_one_lake_artifact_guid}/Files
- ABFS path format: abfss://{your_one_lake_workspace_guid}@{your_one_lake_endpoint}/{your_one_lake_artifact_guid}/Files
Create a OneLake datastore
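The following example creates a credential-less OneLake datastore: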
from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = OneLakeDatastore(
    name="onelake_example_id",
    description="Datastore pointing to an Microsoft fabric artifact.",
    one_lake_workspace_name="bbbbbbbb-7777-8888-9999-cccccccccccc", #{your_one_lake_workspace_guid}
    endpoint="msit-onelake.dfs.fabric.microsoft.com" #{your_one_lake_endpoint}
    artifact = OneLakeArtifact(
        name="cccccccc-8888-9999-0000-dddddddddddd/Files", #{your_one_lake_artifact_guid}/Files
        type="lake_house"
    )
)
ml_client.create_or_update(store)
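The following example creates a OneLake datastore that authenticates with a service principal: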
from azure.ai.ml.entities import OneLakeDatastore, OneLakeArtifact
from azure.ai.ml.entities._datastore.credentials import ServicePrincipalCredentials
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
store = OneLakeDatastore(
    name="onelake_example_sp",
    description="Datastore pointing to an Microsoft fabric artifact.",
    one_lake_workspace_name="bbbbbbbb-7777-8888-9999-cccccccccccc", #{your_one_lake_workspace_guid}
    endpoint="msit-onelake.dfs.fabric.microsoft.com" #{your_one_lake_endpoint}
    artifact = OneLakeArtifact(
    name="cccccccc-8888-9999-0000-dddddddddddd/Files", #{your_one_lake_artifact_guid}/Files
    type="lake_house"
    )
    credentials=ServicePrincipalCredentials(
        tenant_id= "bbbbcccc-1111-dddd-2222-eeee3333ffff",
        client_id= "44445555-eeee-6666-ffff-7777aaaa8888",
        client_secret= "Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4",
    ),
)
ml_client.create_or_update(store)
Create the following YAML file (update the values):
# my_onelake_datastore.yml
$schema: http://azureml/sdk-2-0/OneLakeDatastore.json
name: onelake_example_id
type: one_lake
description: Credential-less datastore pointing to a OneLake lakehouse.
one_lake_workspace_name: "eeeeffff-4444-aaaa-5555-bbbb6666cccc"
endpoint: "msit-onelake.dfs.fabric.microsoft.com"
artifact:
  type: lake_house
  name: "1111bbbb-22cc-dddd-ee33-ffffff444444/Files"
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_onelake_datastore.yml
Create the following YAML file (update the values):
# my_onelakesp_datastore.yml
$schema: http://azureml/sdk-2-0/OneLakeDatastore.json
name: onelake_example_sp
type: one_lake
description: Datastore pointing to a OneLake lakehouse using service principal authentication.
one_lake_workspace_name: "eeeeffff-4444-aaaa-5555-bbbb6666cccc"
endpoint: "msit-onelake.dfs.fabric.microsoft.com"
artifact:
  type: lake_house
  name: "1111bbbb-22cc-dddd-ee33-ffffff444444/Files"
credentials:
  tenant_id: bbbbcccc-1111-dddd-2222-eeee3333ffff
  client_id: 44445555-eeee-6666-ffff-7777aaaa8888
  client_secret: Cc3Dd~4Ee5.-Ff6Gg7Hh8Ii9Jj0Kk1_Ll2Mm3Nn4
Create the Machine Learning datastore in the CLI:
az ml datastore create --file my_onelakesp_datastore.yml
 
Next steps