Commit c9b0ca8

Merge pull request #98559 from nibaccam/sql-datastore
Data | Sql datastore
2 parents 7be96b8 + 132c112 commit c9b0ca8

File tree

1 file changed: +65 -18 lines changed


articles/machine-learning/service/how-to-access-data.md

Lines changed: 65 additions & 18 deletions
@@ -6,10 +6,10 @@ services: machine-learning
 ms.service: machine-learning
 ms.subservice: core
 ms.topic: conceptual
-ms.author: sihhu
-author: MayMSFT
+ms.author: ylxiong
+author: YLXiong1125
 ms.reviewer: nibaccam
-ms.date: 11/04/2019
+ms.date: 12/10/2019
 ms.custom: seodec18

 # Customer intent: As an experienced Python developer, I need to make my data in Azure storage available to my remote compute to train my machine learning models.
@@ -62,35 +62,82 @@ The information you need to populate the register() method can be found via the
 2. The **Overview** page provides information such as, the account name and container or file share name.
 3. For authentication information, like account key or SAS token, navigate to **Account Keys** under the **Settings** pane on the left.

->[IMPORTANT]
+> [!IMPORTANT]
 > If your storage account is in a VNET, only Azure blob datastore creation is supported. Set the parameter, `grant_workspace_access` to `True` to grant your workspace access to your storage account.

-The following examples show you to register an Azure Blob Container or an Azure File Share as a datastore.
+The following examples show how to register an Azure Blob Container, an Azure File Share, or an Azure SQL database as a datastore.

 + For an **Azure Blob Container Datastore**, use [`register_azure_blob-container()`](https://docs.microsoft.com/python/api/azureml-core/azureml.core.datastore(class)?view=azure-ml-py#register-azure-blob-container-workspace--datastore-name--container-name--account-name--sas-token-none--account-key-none--protocol-none--endpoint-none--overwrite-false--create-if-not-exists-false--skip-validation-false--blob-cache-timeout-none--grant-workspace-access-false--subscription-id-none--resource-group-none-)

 The following code creates and registers the datastore, `my_datastore`, to the workspace, `ws`. This datastore accesses the Azure blob container, `my_blob_container`, on the Azure storage account, `my_storage_account` using the provided account key.

 ```Python
-datastore = Datastore.register_azure_blob_container(workspace=ws,
-                                                    datastore_name='my_datastore',
-                                                    container_name='my_blob_container',
-                                                    account_name='my_storage_account',
-                                                    account_key='your storage account key',
-                                                    create_if_not_exists=True)
+blob_datastore_name='azblobsdk' # Name of the Datastore to workspace
+container_name=os.getenv("BLOB_CONTAINER", "<my-container-name>") # Name of Azure blob container
+account_name=os.getenv("BLOB_ACCOUNTNAME", "<my-account-name>") # Storage account name
+account_key=os.getenv("BLOB_ACCOUNT_KEY", "<my-account-key>") # Storage account key
+
+blob_datastore = Datastore.register_azure_blob_container(workspace=ws,
+                                                         datastore_name=blob_datastore_name,
+                                                         container_name=container_name,
+                                                         account_name=account_name,
+                                                         account_key=account_key)
 ```

 + For an **Azure File Share Datastore**, use [`register_azure_file_share()`](https://docs.microsoft.com/python/api/azureml-core/azureml.core.datastore(class)?view=azure-ml-py#register-azure-file-share-workspace--datastore-name--file-share-name--account-name--sas-token-none--account-key-none--protocol-none--endpoint-none--overwrite-false--create-if-not-exists-false--skip-validation-false-).

 The following code creates and registers the datastore, `my_datastore`, to the workspace, `ws`. This datastore accesses the Azure file share, `my_file_share`, on the Azure storage account, `my_storage_account` using the provided account key.

 ```Python
-datastore = Datastore.register_azure_file_share(workspace=ws,
-                                                datastore_name='my_datastore',
-                                                file_share_name='my_file_share',
-                                                account_name='my_storage account',
-                                                account_key='your storage account key',
-                                                create_if_not_exists=True)
+file_datastore_name='azfilesharesdk' # Name of the Datastore to workspace
+file_share_name=os.getenv("FILE_SHARE_CONTAINER", "<my-fileshare-name>") # Name of Azure file share container
+account_name=os.getenv("FILE_SHARE_ACCOUNTNAME", "<my-account-name>") # Storage account name
+account_key=os.getenv("FILE_SHARE_ACCOUNT_KEY", "<my-account-key>") # Storage account key
+
+file_datastore = Datastore.register_azure_file_share(workspace=ws,
+                                                     datastore_name=file_datastore_name,
+                                                     file_share_name=file_share_name,
+                                                     account_name=account_name,
+                                                     account_key=account_key)
+```
+
++ For an **Azure SQL Datastore**, use [register_azure_sql_database()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.datastore.datastore?view=azure-ml-py#register-azure-sql-database-workspace--datastore-name--server-name--database-name--tenant-id-none--client-id-none--client-secret-none--resource-url-none--authority-url-none--endpoint-none--overwrite-false--username-none--password-none-) to register a credential datastore connected to an Azure SQL database with SQL authentication or service principal permissions.
+
+#### By SQL authentication
+
+```python
+sql_datastore_name="azsqlsdksql"
+server_name=os.getenv("SQL_SERVERNAME", "<my-server-name>") # Name of the Azure SQL server
+database_name=os.getenv("SQL_DATABASENAME", "<my-database-name>") # Name of the Azure SQL database
+username=os.getenv("SQL_USER_NAME", "<my-sql-user-name>") # The username of the database user used to access the database
+password=os.getenv("SQL_USER_PASSWORD", "<my-sql-user-password>") # The password of the database user used to access the database
+
+sql_datastore = Datastore.register_azure_sql_database(workspace=ws,
+                                                      datastore_name=sql_datastore_name,
+                                                      server_name=server_name,
+                                                      database_name=database_name,
+                                                      username=username,
+                                                      password=password)
+
+```
+
+#### By service principal
+
+```python
+sql_datastore_name="azsqlsdksp"
+server_name=os.getenv("SQL_SERVERNAME", "<my-server-name>") # Name of the Azure SQL server
+database_name=os.getenv("SQL_DATABASENAME", "<my-database-name>") # Name of the Azure SQL database
+client_id=os.getenv("SQL_CLIENTNAME", "<my-client-id>") # Client ID of the service principal with permissions to access the database
+client_secret=os.getenv("SQL_CLIENTSECRET", "<my-client-secret>") # The secret of the service principal
+tenant_id=os.getenv("SQL_TENANTID", "<my-tenant-id>") # Tenant ID of the service principal
+
+sql_datastore = Datastore.register_azure_sql_database(workspace=ws,
+                                                      datastore_name=sql_datastore_name,
+                                                      server_name=server_name,
+                                                      database_name=database_name,
+                                                      client_id=client_id,
+                                                      client_secret=client_secret,
+                                                      tenant_id=tenant_id)
 ```

 #### Storage guidance
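
The [!IMPORTANT] note in the hunk above calls out the `grant_workspace_access` parameter for storage accounts that sit behind a VNET, but none of the snippets in this commit show it being set. As a minimal sketch (not taken from the article; the datastore name and placeholder values are illustrative only), a blob datastore registration for a VNET-protected account might look like this:

```python
# Hedged sketch: registering a blob datastore when the storage account is in a VNET.
# All names and values below are placeholders, not from the article.
from azureml.core import Workspace, Datastore

ws = Workspace.from_config()  # assumes a workspace config.json is available locally

vnet_blob_datastore = Datastore.register_azure_blob_container(
    workspace=ws,
    datastore_name='vnetblobdatastore',     # placeholder datastore name
    container_name='<my-container-name>',
    account_name='<my-account-name>',
    account_key='<my-account-key>',
    grant_workspace_access=True)            # grants the workspace access to the VNET-protected account
```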
@@ -241,7 +288,7 @@ est = Estimator(source_directory='your code directory',
                 entry_script='train.py',
                 inputs=[datastore1.as_download(), datastore2.path('./foo').as_download(), datastore3.as_upload(path_on_compute='./bar.pkl')])
 ```
-If you prefer to use a RunConfig object for training, you need to set up a [DataReference](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.data_reference.datareference?view=azure-ml-py) object.
+If you prefer to use a RunConfig object for training, you need to set up a [DataReference](https://docs.microsoft.com/python/api/azureml-core/azureml.data.data_reference.datareference?view=azure-ml-py) object.

 The following code shows how to work with a DataReference object in an estimation pipeline. For the full example, see this [notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-how-to-use-estimatorstep.ipynb).
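
The hunk ends before the DataReference snippet it refers to, so that code is not shown here. As a rough sketch (the datastore name, path, and argument wiring are assumptions, and it targets the SDK v1 `ScriptRunConfig` that accepts a `run_config` argument), a DataReference plugged into a RunConfiguration-based run might look like this:

```python
# Hedged sketch: using a DataReference with a RunConfiguration instead of an Estimator.
# 'my_datastore', the path, and the script arguments are placeholders, not from the article.
from azureml.core import Workspace, Datastore, RunConfiguration, ScriptRunConfig
from azureml.data.data_reference import DataReference

ws = Workspace.from_config()                     # assumes a workspace config.json is available locally
datastore = Datastore.get(ws, 'my_datastore')    # assumes 'my_datastore' is already registered

data_ref = DataReference(datastore=datastore,
                         data_reference_name='training_data',
                         path_on_datastore='./foo',
                         mode='download')        # download the data to the compute target

run_config = RunConfiguration()
run_config.data_references = {data_ref.data_reference_name: data_ref.to_config()}

src = ScriptRunConfig(source_directory='your code directory',
                      script='train.py',
                      arguments=['--data-folder', str(data_ref)],  # resolves to the download path at runtime
                      run_config=run_config)
```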

0 commit comments
