articles/storage/blobs/storage-quickstart-blobs-java-quarkus.md (+18 -17)
@@ -1,10 +1,11 @@
 ---
-title: "Quickstart: Quarkus extension for Azure Blob Storage"
+title: "Quickstart: Quarkus Extension for Azure Blob Storage"
 description: In this quickstart, you learn how to use the Quarkus extension for Azure Blob Storage to create a container and a blob in Blob (object) storage. Next, you learn how to download the blob to your local computer, and how to list all of the blobs in a container.
@@ -49,7 +50,7 @@ Application requests to Azure Blob Storage must be authorized. Using `DefaultAzu
 The order and locations in which `DefaultAzureCredential` looks for credentials can be found in the [Azure Identity library overview](/java/api/overview/azure/identity-readme#defaultazurecredential).

-In this quickstart, your app authenticates using your Azure CLI sign-in credentials when running locally. Once it's deployed to Azure, your app can then use a [managed identity](../../active-directory/managed-identities-azure-resources/overview.md). This transition between environments doesn't require any code changes.
+In this quickstart, your app authenticates using your Azure CLI sign-in credentials when running locally. After it's deployed to Azure, your app can then use a [managed identity](../../active-directory/managed-identities-azure-resources/overview.md). This transition between environments doesn't require any code changes.

 ### Assign roles to your Microsoft Entra user account
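For readers who want to see what that credential flow looks like in code, the following is a minimal sketch of the equivalent manual setup with the Azure SDK for Java. It isn't part of the quickstart sample (the Quarkus extension builds and injects this client for you), and the class name is an illustrative assumption:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

public class ManualClientSketch {
    public static void main(String[] args) {
        // Read the endpoint from the environment variable set later in this quickstart.
        String endpoint = System.getenv("QUARKUS_AZURE_STORAGE_BLOB_ENDPOINT");

        // DefaultAzureCredential uses your Azure CLI sign-in locally and a
        // managed identity when the app runs in Azure, with no code changes.
        BlobServiceClient client = new BlobServiceClientBuilder()
            .endpoint(endpoint)
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();

        System.out.println("Connected to account: " + client.getAccountName());
    }
}
```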
@@ -65,18 +66,18 @@ You can authorize access to data in your storage account using the following ste
     az login
     ```

-2. Make sure you provide the endpoint of your Azure Blob Storage account. The following example shows how to set the endpoint using the environment variable `QUARKUS_AZURE_STORAGE_BLOB_ENDPOINT` via the Azure CLI. Replace `<RESOURCE_GROUP_NAME>` and `<STORAGE_ACCOUNT_NAME>` with your resource group and storage account names before running the command:
+2. Make sure you provide the endpoint of your Azure Blob Storage account. The following example shows how to set the endpoint using the environment variable `QUARKUS_AZURE_STORAGE_BLOB_ENDPOINT` via the Azure CLI. Replace `<resource-group-name>` and `<storage-account-name>` with your resource group and storage account names before running the command:

     ```azurecli
     export QUARKUS_AZURE_STORAGE_BLOB_ENDPOINT=$(az storage account show \
-        --resource-group <RESOURCE_GROUP_NAME> \
-        --name <STORAGE_ACCOUNT_NAME> \
+        --resource-group <resource-group-name> \
+        --name <storage-account-name> \
         --query 'primaryEndpoints.blob' \
         --output tsv)
     ```

     > [!NOTE]
-    > When deployed to Azure, you'll need to enable managed identity on your app, and configure your storage account to allow that managed identity to connect. For detailed instructions on configuring this connection between Azure services, see the [Auth from Azure-hosted apps](/azure/developer/java/sdk/identity-azure-hosted-auth) tutorial.
+    > When deployed to Azure, you need to enable managed identity on your app, and configure your storage account to allow that managed identity to connect. For more information on configuring this connection between Azure services, see [Authenticate Azure-hosted Java applications](/azure/developer/java/sdk/identity-azure-hosted-auth).

 ## Run the sample
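As a side note, Quarkus maps environment variables to configuration properties, so `QUARKUS_AZURE_STORAGE_BLOB_ENDPOINT` should surface as the `quarkus.azure.storage.blob.endpoint` property. The following sketch isn't part of the quickstart sample; it assumes that property name and Quarkus 3 `jakarta` imports, and simply logs the configured endpoint at startup so you can confirm the variable was picked up:

```java
import io.quarkus.runtime.StartupEvent;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.event.Observes;
import java.util.Optional;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@ApplicationScoped
public class EndpointCheck {

    // Assumed property name; it mirrors the QUARKUS_AZURE_STORAGE_BLOB_ENDPOINT
    // environment variable through the standard Quarkus configuration mapping.
    @ConfigProperty(name = "quarkus.azure.storage.blob.endpoint")
    Optional<String> endpoint;

    void onStart(@Observes StartupEvent event) {
        // Prints the endpoint if it's set, or a hint if it isn't.
        System.out.println("Blob endpoint: " + endpoint.orElse("<not configured>"));
    }
}
```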
@@ -88,9 +89,9 @@ The code example performs the following actions:
 - Lists the blobs in the container.
 - Downloads the blob data to the local file system.
 - Deletes the blob and container resources created by the app.
-- Deleting the local source and downloaded files.
+- Deletes the local source and downloaded files.

-Run the application in JVM mode using the following command:
+Run the application in JVM mode by using the following command:

 ```bash
 mvn package
@@ -118,7 +119,7 @@ Done
 Before you begin the cleanup process, check your data folder for the two files. You can compare them and observe that they're identical.

-Optionally, you can run the sample in native mode. To do this, you need to have GraalVM installed, or use a builder image to build the native executable. For more information, see the [Building a Native Executable](https://quarkus.io/guides/building-native-image). This quickstart uses Docker as container runtime to build a Linux native executable. If you haven't installed Docker, you can download it from the [Docker website](https://www.docker.com/products/docker-desktop).
+Optionally, you can run the sample in native mode. To do this, you need to have GraalVM installed, or use a builder image to build the native executable. For more information, see [Building a Native Executable](https://quarkus.io/guides/building-native-image). This quickstart uses Docker as the container runtime to build a Linux native executable. If you haven't installed Docker, you can download it from the [Docker website](https://www.docker.com/products/docker-desktop).

 Run the following command to build and execute the native executable in a Linux environment:
@@ -135,7 +136,7 @@ Next, you walk through the sample code to understand how it works.
 Working with any Azure resource using the SDK begins with creating a client object. The Quarkus extension for Azure Blob Storage automatically injects a client object with authorized access using `DefaultAzureCredential`.

-To successfully inject a client object, first you need to add the extensions `quarkus-arc` and `quarkus-azure-storage-blob` to your `pom.xml` file as a dependencies:
+To successfully inject a client object, first you need to add the extensions `quarkus-arc` and `quarkus-azure-storage-blob` to your **pom.xml** file as dependencies:

 ```xml
 <properties>
@@ -183,14 +184,14 @@ Next, you can inject the client object into your application code using the `@In
 BlobServiceClient blobServiceClient;
 ```

-This is all you need to code to get a client object using the Quarkus extension for Azure Blob Storage. To make sure the client object is authorized to access your storage account at runtime, you need to follow steps in the previous section [Authenticate to Azure and authorize access to blob data](#authenticate-to-azure-and-authorize-access-to-blob-data) before running the application.
+That's all you need to code to get a client object using the Quarkus extension for Azure Blob Storage. To make sure the client object is authorized to access your storage account at runtime, you need to follow the steps in the previous section [Authenticate to Azure and authorize access to blob data](#authenticate-to-azure-and-authorize-access-to-blob-data) before running the application.

 ### Manage blobs and containers

-The following code snippet shows how to create a container, upload a blob, list blobs in a container, and download a blob.
+The following code example shows how to create a container, upload a blob, list blobs in a container, and download a blob.

 > [!NOTE]
-> Writing to the local filesystem is considered a bad practice in cloud native applications. However, the sample uses the local filesystem to illustrate the use of blob storage in a way that is easy to for the user to verify. If taking an application to production, review your storage options and choose the best option for your needs. See[Review your storage options](/azure/architecture/guide/technology-choices/storage-options).
+> Writing to the local filesystem is considered a bad practice in cloud native applications. However, the example uses the local filesystem to illustrate the use of blob storage in a way that is easy for the user to verify. When you take an application to production, review your storage options and choose the best option for your needs. For more information, see [Review your storage options](/azure/architecture/guide/technology-choices/storage-options).

 ```java
 // Create a unique name for the container
@@ -258,7 +259,7 @@ downloadedFile.delete();
 System.out.println("Done");
 ```

-These operations are similar to the [Quickstart: Azure Blob Storage client library for Java](storage-quickstart-blobs-java.md). For more detailed code explanations, see the following sections in that quickstart:
+These operations are similar to the ones described in [Quickstart: Azure Blob Storage client library for Java SE](storage-quickstart-blobs-java.md). For more detailed code explanations, see the following sections in that quickstart:

 - [Create a container](storage-quickstart-blobs-java.md#create-a-container)
 - [Upload blobs to a container](storage-quickstart-blobs-java.md#upload-blobs-to-a-container)
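For a consolidated view of the operations covered in this diff, here's a condensed sketch of a Quarkus bean that injects the client and runs the create, upload, list, download, and cleanup steps. It isn't the quickstart's sample code verbatim; the class name, blob name, local file paths, and `jakarta` imports are illustrative assumptions, but the Azure SDK calls are the ones the article walks through:

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.models.BlobItem;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import java.util.UUID;

@ApplicationScoped
public class BlobStorageSample {

    // Injected by the Quarkus extension for Azure Blob Storage, already
    // authorized through DefaultAzureCredential.
    @Inject
    BlobServiceClient blobServiceClient;

    public void run() {
        // Create a container with a unique name.
        String containerName = "quickstartblobs" + UUID.randomUUID();
        BlobContainerClient containerClient =
            blobServiceClient.createBlobContainer(containerName);

        // Upload a local file as a block blob (paths are illustrative).
        BlobClient blobClient = containerClient.getBlobClient("quickstart.txt");
        blobClient.uploadFromFile("data/quickstart.txt");

        // List the blobs in the container.
        for (BlobItem blobItem : containerClient.listBlobs()) {
            System.out.println("Blob: " + blobItem.getName());
        }

        // Download the blob to a new local file.
        blobClient.downloadToFile("data/quickstartDOWNLOAD.txt");

        // Clean up the blob and container resources created above.
        blobClient.delete();
        containerClient.delete();
        System.out.println("Done");
    }
}
```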
includes/assign-roles.md (+1 -1)
@@ -10,7 +10,7 @@ ms.author: alexwolf
 ms.custom: include file
 ---

-When developing locally, make sure that the user account that is accessing blob data has the correct permissions. You'll need **Storage Blob Data Contributor** to read and write blob data. To assign yourself this role, you'll need to be assigned the **User Access Administrator** role, or another role that includes the **Microsoft.Authorization/roleAssignments/write** action. You can assign Azure RBAC roles to a user using the Azure portal, Azure CLI, or Azure PowerShell. You can learn more about the Storage Blob Data Contributor role in[Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles/storage#storage-blob-data-contributor). You can learn more about the available scopes for role assignments on the [scope overview](../articles/role-based-access-control/scope-overview.md) page.
+When developing locally, make sure that the user account that is accessing blob data has the correct permissions. You need the **Storage Blob Data Contributor** role to read and write blob data. To assign yourself this role, you need to be assigned the **User Access Administrator** role, or another role that includes the **Microsoft.Authorization/roleAssignments/write** action. You can assign Azure RBAC roles to a user by using the Azure portal, Azure CLI, or Azure PowerShell. For more information about the **Storage Blob Data Contributor** role, see [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles/storage#storage-blob-data-contributor). For more information about the available scopes for role assignments, see [Understand scope for Azure RBAC](../articles/role-based-access-control/scope-overview.md).

 In this scenario, you'll assign permissions to your user account, scoped to the storage account, to follow the [Principle of Least Privilege](../articles/active-directory/develop/secure-least-privileged-access.md). This practice gives users only the minimum permissions needed and creates more secure production environments.