### Data Services Cloud Console and REST API
Data Services Cloud Console supports a set of REST APIs that allows users to integrate HPE Data Services Cloud Console with their custom applications. By using the [OAuth 2.0 protocol](https://oauth.net/2/) to authenticate and authorize applications, secure and time-limited (***120 minutes***) access to HPE data services is provided via an **access token**. The token ensures that client API requests access HPE data services for the requested operation securely and according to the authorization granted to the user who created it.
> **Note:** You can find the Data Services Cloud Console API documentation [here](https://console-us1.data.cloud.hpe.com/doc/api/v1/) and in the help section of the [HPE GreenLake Cloud Platform](https://console.greenlake.hpe.com/).
The REST APIs support the standard HTTP request methods (GET, POST, PATCH, PUT and DELETE). An HTTP request is made by providing a specific HPE regional connectivity endpoint for the Data Services Cloud Console application instance, the HTTP request method, an access token and a data payload. The HTTP responses for these requests are returned in JSON format.
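
If you prefer to see what such a request looks like outside of Postman, here is a minimal sketch using Python and the `requests` library. The endpoint, token value and resource path below are placeholders for illustration only; refer to the API documentation for the actual resource paths and payloads.

```python
import requests

# Illustrative values only – substitute your regional endpoint, access token and resource path.
base_url = "https://us1.data.cloud.hpe.com"       # regional DSCC connectivity endpoint
access_token = "<access-token>"                   # OAuth access token (see the next sections)
resource_path = "/api/v1/storage-systems"         # example resource path; check the API documentation

headers = {
    "Authorization": f"Bearer {access_token}",    # token presented as an authorization bearer token
    "Accept": "application/json",
}

# GET request against the regional application instance; the response body is JSON.
response = requests.get(f"{base_url}{resource_path}", headers=headers)
print(response.status_code)
print(response.json())
```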
Currently, there are three HPE regional Data Services Cloud Console application instance endpoints:
* https://us1.data.cloud.hpe.com
HPE GreenLake Cloud Platform allows developers to make API calls against a particular regional Data Services Cloud Console customer instance. Using the API functionality in the HPE GreenLake Cloud Platform graphical user interface (GUI), developers can create their **API client application** credentials. The credentials consist of a *ClientID-ClientSecret* pair that represents the permissions granted to the user who creates the API client application credentials. The credentials are then used to generate an OAuth-based access token and to refresh it once it has expired. Once the token is generated or refreshed, it can be used as an **authorization bearer token** to make further secure REST API calls to protected HPE data services resources via the regional Data Services Cloud Console application instance.
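
As an illustration, the sketch below shows what this token exchange could look like in Python, assuming the standard OAuth 2.0 *client_credentials* grant against the default authorization server referenced later in this post (*sso.common.cloud.hpe.com/as/token.oauth2*). The ClientID and ClientSecret values are placeholders.

```python
import requests

# Illustrative sketch of the OAuth 2.0 client_credentials token exchange.
# Replace the placeholders with the credentials you created in the HPE GreenLake Cloud Platform GUI.
sso_uri = "https://sso.common.cloud.hpe.com/as/token.oauth2"
client_id = "<your-client-id>"
client_secret = "<your-client-secret>"

response = requests.post(
    sso_uri,
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    },
)
response.raise_for_status()
token = response.json()["access_token"]   # time-limited access token (120 minutes)

# The token is then presented as an authorization bearer token on subsequent API calls.
headers = {"Authorization": f"Bearer {token}"}
```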
You can refer to [this blog post](https://developer.hpe.com/blog/api-console-for-data-services-cloud-console/) to learn how to create API client application credentials for your specific regional Data Services Cloud Console application instance. Make sure to copy the *ClientID* and *ClientSecret* values to a safe location, as you will need them to generate the access token via a REST API call, which is necessary to complete the following steps.
**Ready? Let’s get started!**
### Step 1 – Sign in to your Postman account
You can sign in to your Postman account either from the [web app](https://identity.getpostman.com/login) or from the desktop app. If you don’t have a Postman account already, you can sign up for one [here](https://identity.getpostman.com/signup) or download the desktop app [here](https://www.postman.com/downloads/).
### Step 2 – Copy the existing HPE GreenLake Data Services Cloud Console API public collection
Once logged in to your Postman account, use the ***Search*** bar to look for the public collection "**Data Services Cloud Console API**" and select the public collection from our community contributor, [Mark van Silfhout](mailto:[email protected]), as shown below:

<span style="color:grey; font-family:Arial; font-size:1em">Figure 1: The Data Services Cloud Console API public collection.</span>
Fork the collection to make a copy of it in your Postman workspace. This allows you to work with your own copy of the collection and perform changes without affecting the parent collection. Select the ***more actions*** icon (the ellipsis) next to the collection, then select ***Create a fork***. When you fork the public collection, you can choose to watch the original collection to be notified about changes made to the parent collection. This allows you to pull updates from the parent collection into your forked copy, should the parent collection be updated.
Alternatively, you can export the public collection to your local storage as a JSON file and import it as a new collection in your Postman workspace.
Define the _current value_ of the collection variables to match your Data Services Cloud Console context:
* **baseUrl**: This variable defines the base URL of the REST API requests. It should match the regional endpoint of your Data Services Cloud Console application instance where your storage devices are registered.
* **ClientId** and **ClientSecret**: these should be set to the values of the API client application credentials you previously created using the HPE GreenLake Cloud Platform GUI. These variables are used to request an OAuth access token by authenticating with the authorization server referenced in the **sso_URI** variable.
* **sso_URI**: This variable is the URI of the OAuth authorization server. If your organization has set up its own HPE GreenLake SAML Single Sign-On (SSO) authorization server to create access tokens, replace the current default value with your SSO URI. Otherwise, keep the value for this variable as currently set to *sso.common.cloud.hpe.com/as/token.oauth2*.
* **BearerToken**: Do not edit this variable. Keep the value field empty. The collection variable BearerToken will be set automatically upon successful execution of the ***GetToken*** API call, as explained in the next step.
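
If you later want to script the same workflow outside of Postman, these collection variables map naturally onto a small configuration block. The sketch below is only an illustration; all values other than the default **sso_URI** are placeholders.

```python
# Hypothetical equivalent of the Postman collection variables, for scripted use.
config = {
    "baseUrl": "https://us1.data.cloud.hpe.com",                    # your regional DSCC endpoint
    "ClientId": "<your-client-id>",                                 # from your API client application credentials
    "ClientSecret": "<your-client-secret>",
    "sso_URI": "https://sso.common.cloud.hpe.com/as/token.oauth2",  # default OAuth authorization server
    "BearerToken": None,                                            # filled in after a successful token request
}
```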
### Step 4 – Acquire an OAuth access token as your session bearer token
The Data Services Cloud Console API uses a bearer token as the authorization type to ensure that all REST API requests access authorized data services securely. You therefore first need to obtain a token from the OAuth authorization server before you can make any REST API calls to your regional Data Services Cloud Console application instance. To do so, proceed as follows:
* From your collection, generate the token using the ***GetToken*** API call from the ***GetToken-Using-Variables*** folder.
* Verify you get a status code of 200 for a successful response with the token value in the response body.
* Check that the token value has been automatically defined for the collection variable ***BearerToken***.
The *GetToken* API call defines a script in the ***Tests*** tab that programmatically sets the collection variable BearerToken, as shown in the picture below. The programmatically defined token is then used to authenticate any subsequent REST API calls.
### Step 5 – Make subsequent secure REST API calls
The client REST API requests are authenticated by presenting the access token as the authorization bearer token to the regional Data Services Cloud Console application instance. The instance validates the access token, and if valid, serves the request.
Pick one REST API call from the ***storage-systems*** folder to ***Get all storage systems*** registered with your regional Data Services Cloud Console application instance.
<span style="color:grey; font-family:Arial; font-size:1em">Figure 6: REST API request with query parameter to search for storage arrays of type HPE Alletra 9060.</span>
The query parameters are indicated after the question mark (“?”) in the REST API URL. Select and adjust the query parameters according to your environment. Make sure to refer to [the API documentation](https://console-us1.data.cloud.hpe.com/doc/api/v1/) to understand the query parameters that can be used for each HPE Data Services Cloud Console API request.
Finally, click the **Send** button. You will get a JSON representation of the storage system resources registered, based on the query parameters specified. Here, I am getting the list of storage systems of type “*HPE Alletra 9060*”.
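
For comparison, here is a rough Python sketch of the same kind of filtered request. The query parameter name, filter syntax and response fields shown are assumptions made for the example; always check the API documentation for the parameters each request actually supports.

```python
import requests

# Illustrative only – parameter name, filter expression and response fields are assumed.
base_url = "https://us1.data.cloud.hpe.com"
access_token = "<access-token>"                             # bearer token acquired in Step 4
headers = {"Authorization": f"Bearer {access_token}"}

# Query parameters appear after the "?" in the request URL; requests builds that URL for you.
params = {"filter": "systemModel eq 'HPE Alletra 9060'"}    # hypothetical filter expression
response = requests.get(f"{base_url}/api/v1/storage-systems",
                        headers=headers, params=params)
response.raise_for_status()

for system in response.json().get("items", []):             # "items" list assumed in the JSON response
    print(system.get("id"), system.get("name"))
```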
## Summary
This blog post gives you a great example of how to obtain the access token using an API call and should help you get started with the Data Services Cloud Console REST API for the *HPE GreenLake for Block Storage* data service using Postman. Additional HPE data services APIs will be published as the cloud data services delivered through the HPE Data Services Cloud Console expand. Make sure you stay tuned for any updates to Mark’s public Postman collection for the Data Services Cloud Console API.
If you have more time, I invite you to further explore the rest of the collection on your own, while adjusting the query parameters to match your Data Services Cloud Console context. Also, don’t hesitate to provide Mark with feedback on his very convenient collection.
Any questions on HPE GreenLake Data Services Cloud Console API? Please join the [HPE Developer Slack Workspace](https://slack.hpedev.io/) and start a discussion in our [\#hpe-greenlake-data-services-cloud-console](https://hpedev.slack.com/archives/C02D6H623JP) channel.