<!-- articles/synapse-analytics/sql-data-warehouse/load-data-from-azure-blob-storage-using-copy.md -->
---
ms.date: 08/20/2024
ms.service: azure-synapse-analytics
ms.subservice: sql-dw
ms.topic: conceptual
ms.custom:
  - azure-synapse
---
# Tutorial: Load the New York Taxicab dataset
It's best to create a login and user that is dedicated to loading data.

Connect as the server admin so you can create logins and users. Use these steps to create a login and user called `LoaderRC20`. Then assign the user to the `staticrc20` resource class.
1. In SSMS, right-click `master` to show a dropdown menu, and choose **New Query**. A new query window opens.

1. In the query window, enter these T-SQL commands to create a login and user named `LoaderRC20`, substituting your own strong password.

   ```sql
   CREATE LOGIN LoaderRC20 WITH PASSWORD = '<strong password here>';
   CREATE USER LoaderRC20 FOR LOGIN LoaderRC20;
   ```

1. Select **Execute**.

1. Right-click **mySampleDataWarehouse**, and choose **New Query**. A new query window opens.

1. Enter the following T-SQL commands to create a database user named `LoaderRC20` for the `LoaderRC20` login. The second line grants the new user CONTROL permissions on the new data warehouse. These permissions are similar to making the user the owner of the database. The third line adds the new user as a member of the `staticrc20` [resource class](resource-classes-for-workload-management.md).

   ```sql
   CREATE USER LoaderRC20 FOR LOGIN LoaderRC20;
   GRANT CONTROL ON DATABASE::[mySampleDataWarehouse] TO LoaderRC20;
   EXEC sp_addrolemember 'staticrc20', 'LoaderRC20';
   ```

1. Select **Execute**.
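To confirm the setup, you can optionally query the catalog views. The following is a quick sketch using standard catalog views (`sys.database_role_members`, `sys.database_principals`); run it in the **mySampleDataWarehouse** query window:

```sql
-- List the database roles that LoaderRC20 is a member of.
SELECT rp.name AS role_name,
       mp.name AS member_name
FROM sys.database_role_members AS drm
JOIN sys.database_principals AS rp
    ON drm.role_principal_id = rp.principal_id
JOIN sys.database_principals AS mp
    ON drm.member_principal_id = mp.principal_id
WHERE mp.name = 'LoaderRC20';
```

If the statements above succeeded, `staticrc20` should appear in the `role_name` column.
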
## Connect to the server as the loading user
The first step toward loading data is to sign in as `LoaderRC20`.

1. In Object Explorer, select the **Connect** dropdown menu and select **Database Engine**. The **Connect to Server** dialog box appears.

1. Enter the fully qualified server name, and enter `LoaderRC20` as the login. Enter your password for `LoaderRC20`.

1. Select **Connect**.

1. When your connection is ready, you'll see two server connections in Object Explorer: one connection as ServerAdmin and one connection as LoaderRC20.
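Before you start loading, it can help to verify which login and database the new connection uses. A minimal check with built-in metadata functions:

```sql
-- Confirm the login and database for the current connection.
SELECT SUSER_SNAME() AS login_name,
       DB_NAME()     AS database_name;
```

The `login_name` column should show `LoaderRC20` when you're connected as the loading user.
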
## Create tables for the sample data

You're ready to begin the process of loading data into your new data warehouse. This part of the tutorial shows you how to use the COPY statement to load the New York City taxi cab dataset from an Azure Storage blob. For future reference, to learn how to get your data to Azure Blob Storage or to load it directly from your source, see the [loading overview](design-elt-data-loading.md).

Run the following SQL scripts and specify information about the data you wish to load. This information includes where the data is located, the format of the contents of the data, and the table definition for the data.

1. In the previous section, you signed in to your data warehouse as `LoaderRC20`. In SSMS, right-click your LoaderRC20 connection and select **New Query**. A new query window appears.

1. Compare your query window to the previous image. Verify your new query window is running as `LoaderRC20` and performing queries on your `mySampleDataWarehouse` database. Use this query window to perform all of the loading steps.

1. Run the following T-SQL statements to create the tables:

   ```sql
   CREATE TABLE [dbo].[Date]
   ...
   ```

This section uses the [COPY statement to load](/sql/t-sql/statements/copy-into-transact-sql?view=azure-sqldw-latest&preserve-view=true) the sample data from Azure Blob Storage.
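For orientation, the general shape of a COPY load for one of these tables looks like the following sketch. The storage URL is a placeholder, not the tutorial's actual path, and the options shown are only the common ones for delimited text:

```sql
-- Illustrative sketch only: substitute the tutorial's real storage path and options.
COPY INTO [dbo].[Date]
FROM 'https://<storage-account>.blob.core.windows.net/<container>/Date/'
WITH (
    FILE_TYPE = 'CSV',        -- delimited text files
    FIELDTERMINATOR = ','     -- adjust to match the source files
);
```

COPY runs under the permissions of the connected user, which is why the tutorial connects as `LoaderRC20` before loading.
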

1. View your data as it loads. You're loading several GBs of data and compressing it into highly performant clustered columnstore indexes. Run the following query that uses dynamic management views (DMVs) to show the status of the load.

   ```sql
   SELECT r.[request_id]
   ...
   , r.command;
   ```

1. View all system queries.

   ```sql
   SELECT * FROM sys.dm_pdw_exec_requests;
   ```

1. Enjoy your data nicely loaded into your data warehouse.

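As a follow-up to the monitoring queries above, you can filter the same DMV down to requests that are still running. A sketch (these columns all exist on `sys.dm_pdw_exec_requests`):

```sql
-- Show only unfinished requests, newest first.
SELECT request_id, status, submit_time, total_elapsed_time, command
FROM sys.dm_pdw_exec_requests
WHERE status NOT IN ('Completed', 'Failed', 'Cancelled')
ORDER BY submit_time DESC;
```
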
## Clean up resources
You're charged for compute resources and the data that you loaded into your data warehouse. These are billed separately.

- If you want to keep the data in storage, you can pause compute when you aren't using the data warehouse. By pausing compute, you're charged only for data storage, and you can resume the compute whenever you're ready to work with the data.

- If you want to remove future charges, you can delete the data warehouse.

Follow these steps to clean up resources as you desire.

1. Sign in to the [Azure portal](https://portal.azure.com), and select your data warehouse.

1. To pause compute, select the **Pause** button. When the data warehouse is paused, you see a **Start** button. To resume compute, select **Start**.

1. To remove the data warehouse so you won't be charged for compute or storage, select **Delete**.

1. To remove the server you created, select **mynewserver-20180430.database.windows.net** in the previous image, and then select **Delete**. Be careful with this, as deleting the server deletes all databases assigned to the server.

1. To remove the resource group, select **myResourceGroup**, and then select **Delete resource group**.

## Next steps

In this tutorial, you learned how to create a data warehouse and create a user for loading data. You used the simple [COPY statement](/sql/t-sql/statements/copy-into-transact-sql?view=azure-sqldw-latest&preserve-view=true#examples) to load data into your data warehouse.

You did these things:

> [!div class="checklist"]
>
> * Created a data warehouse in the Azure portal
> * Set up a server-level firewall rule in the Azure portal
> * Connected to the data warehouse with SSMS
> * Created a user designated for loading data
> * Created the tables for the sample data
> * Used the COPY T-SQL statement to load data into your data warehouse
> * Viewed the progress of data as it loads

Advance to the development overview to learn how to migrate an existing database to Azure Synapse Analytics:

> [!div class="nextstepaction"]
> [Design decisions to migrate an existing database to Azure Synapse Analytics](sql-data-warehouse-overview-develop.md)

For more loading examples and references, view the following documentation:

---

This quickstart assumes that you already have a preexisting instance of Azure Synapse Analytics.

1. Search for Striim in the Azure Marketplace, and select the "Striim for Data Integration to Azure Synapse Analytics (Staged)" option.
1. Now, open your favorite browser and navigate to `<DNS Name>:9080`.

   :::image type="content" source="media/striim-quickstart/navigate.png" alt-text="Screenshot from the Azure portal of the sign-in screen.":::

1. Sign in with the username and the password you set up in the Azure portal, and select your preferred wizard to get started, or go to the Apps page to start using the drag-and-drop UI.