Commit 83ac8e4

Update Blog “how-to-use-dscc-api-and-ansible-to-collect-the-storage-configuration”
1 parent ac1cc6e commit 83ac8e4

File tree

1 file changed: +10 -6 lines changed


content/blog/how-to-use-dscc-api-and-ansible-to-collect-the-storage-configuration.md

Lines changed: 10 additions & 6 deletions
@@ -8,6 +8,8 @@ disable: false
 tags:
   - data-services-on-the-hpe-greenlake-platform
 ---
+
+
 Capturing the current storage configuration to verify it against best practices or configuration rules is something that customers request regularly. If the customer uses Ansible as their automation platform, the [HPE 3PAR Ansible module](https://github.com/HewlettPackard/hpe3par_ansible_module?tab=readme-ov-file) can be used to create and delete hosts, volumes, etc., but it is not really a solution for gathering the complete configuration.
 
 Furthermore, this module uses the WSAPI of individual Alletra storage systems. The HPE Data Services Cloud Console (DSCC) would be the better option to collect storage configuration data of multiple systems, even those that might be distributed across multiple sites. Through a single location, the DSCC would be able to collect the data of all storage systems.
@@ -28,11 +30,13 @@ In order to be independent of any Python library (or the lack of updates to a Py
 
 ## Retrieving a DSCC access token
 
-The steps to first generate the client id and the client secret used to access the DSCC REST API was already described in a blog on the HPE Developer Portal: <!--StartFragment--> [Using HPE GreenLake Console's API Gateway for Data Services Cloud Console ](https://developer.hpe.com/blog/api-console-for-data-services-cloud-console/)<!--EndFragment-->.
+The steps to first generate the client id and the client secret used to access the DSCC REST API were already described in a post on the HPE Developer Community blog: [Using HPE GreenLake Console's API Gateway for Data Services Cloud Console](https://developer.hpe.com/blog/api-console-for-data-services-cloud-console/).
 
-Once you do have your client id and client secret, you can generate an access token, that is valid for two hours. This access token will allow you to issue REST API calls to the Data Services Cloud Console - you will have the empowerment on the REST API as the user that is linked with the client id and secret used to create the access token. Hence, it is best practice to store the client id and secret in a secure place. The below code example had the client credentials stored in the credentials.yml file, that was encrypted using ansible-vault. The current Ansible playbook stores the access token in a file that grants access only to the current user (hence, the access mode 600 for this file) to avoid misuse of the retrieved access token.
+Once you have your client id and client secret, you can generate an access token that is valid for two hours. This access token will allow you to issue REST API calls to the Data Services Cloud Console, as it identifies you as the user that is linked with the client id and secret used to create the access token. Hence, it is best practice to store the client id and secret in a secure place.
 
-```
+The code example below has the client credentials stored in the credentials.yml file, which was encrypted using ansible-vault. The current Ansible playbook stores the access token in a file that grants access only to the current user (hence, the access mode 600 for this file) to avoid misuse of the retrieved access token.
+
+```yaml
 - name: Include encrypted vars
   include_vars: credentials.yml
 
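For readers following along, here is a minimal sketch of what such a token-retrieval play could look like, using Ansible's built-in uri and copy modules. The token endpoint URL, the variable names dscc_client_id and dscc_client_secret, and the output file name are illustrative assumptions, not taken from this commit:

```yaml
# Minimal sketch: obtain a DSCC access token via the client credentials grant.
# Assumptions: credentials.yml (ansible-vault encrypted) defines dscc_client_id
# and dscc_client_secret; the token endpoint is the HPE GreenLake SSO endpoint
# (verify the URL for your environment).
- name: Include encrypted vars
  include_vars: credentials.yml

- name: Request an access token (valid for two hours)
  uri:
    url: "https://sso.common.cloud.hpe.com/as/token.oauth2"
    method: POST
    body_format: form-urlencoded
    body:
      grant_type: client_credentials
      client_id: "{{ dscc_client_id }}"
      client_secret: "{{ dscc_client_secret }}"
    return_content: true
  register: token_response

- name: Store the token readable only by the current user (mode 600)
  copy:
    content: "{{ token_response.json.access_token }}"
    dest: "{{ playbook_dir }}/access_token.txt"
    mode: '0600'
```

Writing the token to a mode 600 file matches the practice described in the post: the token can be reused for its two-hour lifetime, but is readable only by the user who created it.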
@@ -60,15 +64,15 @@ Once you do have your client id and client secret, you can generate an access to
 
 ## DSCC REST API call
 
-A DSCC REST API call can be with and without a request body and can have multiple responses depending on the actual API call. Nevertheless, it is good practice to build a modular code approach that uses a generalized REST API call to access the Data Services Cloud Console. The generalized DSCC REST API call is has as parameters:
+A DSCC REST API call can be made with or without a request body and can have multiple responses depending on the actual API call. Nevertheless, it is good practice to build a modular code approach that uses a generalized REST API call to access the Data Services Cloud Console. The generalized DSCC REST API call has the following parameters:
 
-* requestUri (as mentioned in the <!--StartFragment-->[](https://developer.hpe.com/greenlake/data-services-on-the-hpe-greenlake-platform/home/)[Data Services REST API](https://console-us1.data.cloud.hpe.com/doc/api/v1/)[](https://console-us1.data.cloud.hpe.com/doc/api/v1/)<!--EndFragment-->)
+* requestUri (as documented in the [Data Services REST API](https://console-us1.data.cloud.hpe.com/doc/api/v1/))
 * request method (GET, POST, DELETE, PUT)
 * request body (optional)
 
 and is shown in the following code block:
 
-```
+```yaml
 - name: Include encrypted vars
   include_vars: vars/credentials.yml
 
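To illustrate the generalized call described in this hunk, here is a minimal sketch under the same assumptions as above (the token stored in access_token.txt). The variable names request_uri, request_method, and request_body, as well as the us1 base URL, are placeholders chosen for the example and not necessarily those used in the blog post:

```yaml
# Minimal sketch of a generalized DSCC REST API call.
# Assumptions: the access token was saved to access_token.txt (mode 600),
# and request_uri / request_method / request_body are set by the caller.
- name: Read the stored access token
  set_fact:
    access_token: "{{ lookup('file', 'access_token.txt') }}"

- name: Generalized DSCC REST API call
  uri:
    url: "https://us1.data.cloud.hpe.com{{ request_uri }}"
    method: "{{ request_method | default('GET') }}"
    headers:
      Authorization: "Bearer {{ access_token }}"
    body: "{{ request_body | default(omit) }}"
    body_format: json
    return_content: true
  register: api_response
```

A caller could then, for instance, set request_uri to /api/v1/storage-systems with request_method GET to retrieve the storage systems visible in the console.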