---
subcategory: "Inference"
-page_title: "Scaleway: scaleway_inference_deployment"
+page_title: "Scaleway: scaleway_inference_custom_model"
---

# Resource: scaleway_inference_custom_model
@@ -22,16 +22,16 @@ resource "scaleway_inference_custom_model" "test" {
### Deploy your own model on your managed inference

```terraform
-resource "scaleway_inference_custom_model" "test" {
+resource "scaleway_inference_custom_model" "my_model" {
  name   = "my-awesome-model"
  url    = "https://huggingface.co/my-awesome-model"
  secret = "my-secret-token"
}

-resource "scaleway_inference_deployment" "main" {
+resource "scaleway_inference_deployment" "my_deployment" {
  name      = "test-inference-deployment-basic"
  node_type = "A100-80GB" # replace with your node type
-  model_id  = scaleway_inference_custom_model.test.id
+  model_id  = scaleway_inference_custom_model.my_model.id

  public_endpoint {
    is_enabled = true
@@ -65,4 +65,12 @@ In addition to all arguments above, the following attributes are exported:
- `quantization` - A list of supported quantization options, including:
  - `quantization_bits` - Number of bits used for quantization (e.g., 8, 16).
  - `allowed` - Whether this quantization is allowed.
-  - `max_context_size` - Maximum context length supported by this quantization.
+  - `max_context_size` - Maximum context length supported by this quantization.
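+
+For a quick way to inspect these exported values after an apply, here is a minimal sketch using Terraform outputs; the output names below are illustrative and the resource reference reuses `my_model` from the example above:
+
+```terraform
+# Illustrative output names (not part of the provider schema).
+output "custom_model_id" {
+  value = scaleway_inference_custom_model.my_model.id
+}
+
+# Exposing the whole resource object makes every exported attribute, including
+# the quantization entries described above, readable with
+# `terraform output -json custom_model`. Marked sensitive because the
+# resource also takes a `secret` argument.
+output "custom_model" {
+  value     = scaleway_inference_custom_model.my_model
+  sensitive = true
+}
+```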
+
+## Import
+
+Custom models can be imported using `{region}/{id}`, as shown below:
+
+```bash
+terraform import scaleway_inference_custom_model.my_model fr-par/11111111-1111-1111-1111-111111111111
+```