
Commit cb245d9

Update Blog “learn-what-you-can-do-with-hpe-data-services-cloud-console-api-in-just-3-minutes”
1 parent 4dbfd8b commit cb245d9

File tree

1 file changed (+14, -4 lines)

content/blog/learn-what-you-can-do-with-hpe-data-services-cloud-console-api-in-just-3-minutes.md

Lines changed: 14 additions & 4 deletions
@@ -7,14 +7,14 @@ authorimage: https://gravatar.com/avatar/f66dd9562c53567466149af06ae9d4f1?s=96
disable: false
tags:
- hpe-greenlake
- hpe-greenlake-cloud-platform
- data-services-cloud-console
---
[HPE Data Services Cloud Console](https://developer.hpe.com/greenlake/data-services-cloud-console/home/), available through the [HPE GreenLake edge-to-cloud platform](https://developer.hpe.com/greenlake/hpe-greenlake-cloud-platform/home/), is a Software-as-a-Service (SaaS) cloud console application that delivers a suite of cloud data services, enabling unified data operations as a service for storage infrastructure, simplifying storage and data management, and bringing the cloud experience to wherever data lives. Data Services Cloud Console also offers a unified, fully programmable API that enables developers to automate data infrastructure management.

If you’re looking for a quick way to discover everything you can do with the HPE GreenLake Data Services Cloud Console API using popular tools that don’t require programming, such as Postman, this blog post is definitely for you.
As you know, one of the benefits of working within a community is the ability to take advantage of open collaboration, sharing hints, tools, and resources. This is exactly what I am doing here. This post helps you get started with the Data Services Cloud Console API for the HPE GreenLake for Block Storage cloud data service by taking advantage of a Postman collection contributed by one of our HPE Developer Community members.
> **Note:** This blog post assumes you have created an [HPE GreenLake account](https://console.greenlake.hpe.com/) and joined it to your company account (also called an ***organization***). It also assumes the administrator for your organization has assigned you the appropriate roles and permissions to access HPE data services resources (for example, storage arrays and volumes) through the Data Services Cloud Console application instances. A Data Services Cloud Console application instance is a service cluster running in one of the HPE regions.
@@ -30,14 +30,17 @@ Currently, there are three HPE regional Data Services Cloud Console application
* EU Central
  * https://eu1.data.cloud.hpe.com
* AP Northeast
  * https://jp1.data.cloud.hpe.com
* US West
  * https://us1.data.cloud.hpe.com
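When scripting against the API, the regional endpoints listed above can be kept in a small lookup table. A minimal Python sketch; the short region keys are illustrative labels of my own, not official identifiers:

```python
# Regional DSCC endpoints from the list above.
# The region keys ("eu-central", "ap-northeast", "us-west") are illustrative.
DSCC_ENDPOINTS = {
    "eu-central": "https://eu1.data.cloud.hpe.com",
    "ap-northeast": "https://jp1.data.cloud.hpe.com",
    "us-west": "https://us1.data.cloud.hpe.com",
}

def base_url(region: str) -> str:
    """Return the base URL for a region; raises KeyError on unknown regions."""
    return DSCC_ENDPOINTS[region]
```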
HPE GreenLake Cloud Platform allows developers to make API calls against a particular regional Data Services Cloud Console customer instance. Using the API functionality in the HPE GreenLake Cloud Platform graphical user interface (GUI), developers can create **API client application** credentials. The credentials consist of a *ClientID-ClientSecret* pair that represents the permissions granted to the user who created them for accessing protected resources. The credentials are then used to generate, and later refresh, an expired OAuth-based access token. Once generated or refreshed, the token is used as an **authorization bearer token** to make secure REST API calls to HPE data services resources via the regional Data Services Cloud Console application instance.
You can refer to [this blog post](https://developer.hpe.com/blog/api-console-for-data-services-cloud-console/) to learn how to create API client application credentials for your specific regional Data Services Cloud Console application instance. Make sure to copy the *ClientID* and *ClientSecret* values to a safe location; you will need them to generate the access token via a REST API call, as explained in the next sections.
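Outside of Postman, the credential-to-token exchange described above can be sketched in a few lines of Python. This is a minimal, standard-library-only sketch; it assumes the default SSO endpoint shown later in this post and the OAuth *client_credentials* grant:

```python
# Sketch: exchange API client application credentials (ClientID/ClientSecret)
# for an OAuth access token. Assumes the default HPE GreenLake SSO endpoint
# and the client_credentials grant.
import json
import urllib.parse
import urllib.request

SSO_URI = "https://sso.common.cloud.hpe.com/as/token.oauth2"

def build_token_request(client_id: str, client_secret: str) -> dict:
    """Form fields for the client_credentials grant."""
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }

def get_access_token(client_id: str, client_secret: str) -> str:
    """POST the form-encoded grant and return the access token string."""
    data = urllib.parse.urlencode(build_token_request(client_id, client_secret)).encode()
    req = urllib.request.Request(SSO_URI, data=data, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The returned token is what Postman stores in the ***BearerToken*** collection variable.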
@@ -73,18 +76,25 @@ The Data Services Cloud Console API collection built by Mark makes use of collec
Define the **Current Value** of the collection variables to match your Data Services Cloud Console context:
* **baseUrl**: This variable defines the base URL of the REST API requests. It should match the regional endpoint of the Data Services Cloud Console application instance where your storage devices are registered.
* **ClientId** and **ClientSecret**: Set these to the values of the API client application credentials you previously created using the HPE GreenLake Cloud Platform GUI. These variables are used to request an OAuth access token by authenticating with the authorization server referenced in the **sso_URI** variable.
* **sso_URI**: This variable is the URI of the OAuth authorization server. If your organization has set up its own HPE GreenLake SAML Single Sign-On (SSO) authorization server to create access tokens, replace the current default value with your SSO URI. Otherwise, keep the value as currently set: *sso.common.cloud.hpe.com/as/token.oauth2*.
* **BearerToken**: Do not edit this variable. Keep the value field empty. The collection variable BearerToken will be set automatically upon successful execution of the ***GetToken*** API call, as explained in the next step.

### Step 4 – Acquire an OAuth access token as your session bearer token
The Data Services Cloud Console API uses a bearer token as the authorization type to ensure that all REST API requests access authorized data services securely. So you first need to obtain a token from the OAuth authorization server before you can make any REST API calls to your regional Data Services Cloud Console application instance. To do so, proceed as follows:

* From your collection, generate the token using the ***GetToken*** API call from the ***GetToken-Using-Variables*** folder.
* Verify that you get a status code of 200 for a successful response, with the token value in the response body.
* Check that the token value has been automatically assigned to the collection variable ***BearerToken***.
The *GetToken* API call defines a script in the ***Tests*** tab to programmatically set the collection variable BearerToken, as shown in the picture below. The programmatically set token is then used to authenticate all subsequent REST API calls.

![Defining collection variables programmatically in script](/img/tests-capturebearertoken-dscc-collection-postman-figure3.png "Defining collection variables programmatically in script")
