* adds dataset_size_in_gb attribute
* increment Go API version
* addresses PR comments
* adds dataset_size_in_gb attribute
* addresses PR comments
* fix issue with update
* fix UT and add to docs
* `subscription_id`: (Required) The ID of the Active-Active subscription to create the database in. **Modifying this attribute will force creation of a new resource.**
* `name` - (Required) A meaningful name to identify the database. **Modifying this attribute will force creation of a new resource.**
- * `memory_limit_in_gb` - (Required) Maximum memory usage for this specific database, including replication and other overhead
+ * `memory_limit_in_gb` - (Optional - **Required if `dataset_size_in_gb` is unset**) Maximum memory usage for this specific database, including replication and other overhead. **Deprecated in favor of `dataset_size_in_gb` - it is not possible to import databases with this attribute set**
+ * `dataset_size_in_gb` - (Optional - **Required if `memory_limit_in_gb` is unset**) The maximum amount of data in the dataset for this specific database, in GB

Note: Due to constraints in the Redis Cloud API, the import process will not import global attributes or override region attributes. If you wish to use these attributes in your Terraform configuration, you will need to manually add them to your Terraform configuration and run `terraform apply` to update the database.

+ Additionally, `memory_limit_in_gb` cannot be set during imports as it is deprecated. If you need to set the `memory_limit_in_gb` attribute, you will need to create a new database resource. It is recommended to use the `dataset_size_in_gb` attribute instead.
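To make the switch concrete, here is a minimal sketch of an Active-Active database sized with the new `dataset_size_in_gb` attribute instead of the deprecated `memory_limit_in_gb`. The referenced subscription resource and the literal values are illustrative assumptions, not part of this change.

```hcl
# Sketch only: an Active-Active database sized with dataset_size_in_gb.
# The subscription resource "example" and all values are assumed for illustration.
resource "rediscloud_active_active_subscription_database" "example" {
  subscription_id    = rediscloud_active_active_subscription.example.id
  name               = "my-aa-database"
  dataset_size_in_gb = 1

  # memory_limit_in_gb is deprecated in favor of dataset_size_in_gb and
  # cannot be set on databases brought in via `terraform import`.
}
```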
@@ -98,7 +98,8 @@ The `cloud_provider` block supports:
The `creation_plan` block supports:

- * `memory_limit_in_gb` - (Required) Maximum memory usage that will be used for your largest planned database.
+ * `memory_limit_in_gb` - (Required) Maximum memory usage that will be used for your largest planned database. You cannot set both `dataset_size_in_gb` and `memory_limit_in_gb`. **Deprecated: Use `dataset_size_in_gb` instead**
+ * `dataset_size_in_gb` - (Required) The maximum amount of data in the dataset for this specific database, in GB. You cannot set both `dataset_size_in_gb` and `memory_limit_in_gb`.
* `modules` - (Optional) a list of modules that will be used by the databases in this subscription. Not currently compatible with ‘ram-and-flash’ memory storage.
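A hedged sketch of how a `creation_plan` block might look when sized with `dataset_size_in_gb`. The quantity, replication, and throughput values, along with the omitted subscription-level settings, are assumptions for illustration and not part of the documented change.

```hcl
# Sketch only: a creation_plan sized with dataset_size_in_gb. Subscription-level
# settings (payment method, cloud_provider block, etc.) are omitted for brevity,
# and the values below are assumed for illustration.
resource "rediscloud_subscription" "example" {
  name = "my-subscription"

  creation_plan {
    dataset_size_in_gb           = 10 # do not also set the deprecated memory_limit_in_gb
    quantity                     = 1
    replication                  = true
    throughput_measurement_by    = "operations-per-second"
    throughput_measurement_value = 10000
    modules                      = ["RedisJSON"]
  }
}
```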
@@ -82,7 +82,8 @@ The following arguments are supported:
* `name` - (Required) A meaningful name to identify the database
* `throughput_measurement_by` - (Required) Throughput measurement method that will be used by your databases. Either `number-of-shards` or `operations-per-second`. **`number-of-shards` is deprecated and only supported for legacy deployments.**
* `throughput_measurement_value` - (Required) Throughput value (as applies to selected measurement method)
- * `memory_limit_in_gb` - (Required) Maximum memory usage for this specific database
+ * `memory_limit_in_gb` - (Optional - **Required if `dataset_size_in_gb` is unset**) Maximum memory usage for this specific database, including replication and other overhead. **Deprecated in favor of `dataset_size_in_gb` - it is not possible to import databases with this attribute set**
+ * `dataset_size_in_gb` - (Optional - **Required if `memory_limit_in_gb` is unset**) The maximum amount of data in the dataset for this specific database, in GB
* `protocol` - (Optional) The protocol that will be used to access the database (either ‘redis’ or ‘memcached’). Default: ‘redis’. **Modifying this attribute will force creation of a new resource.**

Note: Due to constraints in the Redis Cloud API, `memory_limit_in_gb` cannot be set during imports as it is deprecated. If you need to set the `memory_limit_in_gb` attribute, you will need to create a new database resource. It is recommended to use the `dataset_size_in_gb` attribute instead.
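For comparison, a minimal sketch of a single-region database declared with `dataset_size_in_gb`; the subscription reference and all values are assumptions for illustration.

```hcl
# Sketch only: a single-region database sized with dataset_size_in_gb.
# The subscription reference and all values are assumed for illustration.
resource "rediscloud_subscription_database" "example" {
  subscription_id              = rediscloud_subscription.example.id
  name                         = "my-database"
  dataset_size_in_gb           = 1
  throughput_measurement_by    = "operations-per-second"
  throughput_measurement_value = 10000
  protocol                     = "redis"
}
```

Because `memory_limit_in_gb` cannot be set during imports, a configuration written this way can be both created fresh and imported, which is why the documentation steers new configurations toward `dataset_size_in_gb`.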