LocalAI spawns child processes to run model backends (e.g., llama.cpp, diffusers, whisper). To properly stop these processes and free resources like VRAM, LocalAI needs permission to send signals to its child processes.

If you're using restrictive security contexts, ensure the `CAP_KILL` capability is available:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: local-ai
spec:
  containers:
    - name: local-ai
      image: quay.io/go-skynet/local-ai:latest
      securityContext:
        allowPrivilegeEscalation: false
        capabilities:
          drop:
            - ALL
          add:
            - KILL # Required for LocalAI to stop backend processes
        seccompProfile:
          type: RuntimeDefault
        runAsNonRoot: true
        runAsUser: 1000
```
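
Most installations run LocalAI as a Deployment rather than a bare Pod; the same `securityContext` then goes under `spec.template.spec.containers`. A minimal sketch (the labels, replica count, and image tag here are illustrative, not taken from the LocalAI charts):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: local-ai
spec:
  replicas: 1
  selector:
    matchLabels:
      app: local-ai
  template:
    metadata:
      labels:
        app: local-ai
    spec:
      containers:
        - name: local-ai
          image: quay.io/go-skynet/local-ai:latest
          securityContext:
            allowPrivilegeEscalation: false
            capabilities:
              drop:
                - ALL
              add:
                - KILL # same capability as in the Pod example above
            seccompProfile:
              type: RuntimeDefault
            runAsNonRoot: true
            runAsUser: 1000
```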

Without the `KILL` capability, LocalAI cannot terminate backend processes when models are stopped, so backend processes keep running and resources such as VRAM are never released.
4. If running in privileged mode works but the above doesn't, check your cluster's Pod Security Policies or Pod Security Standards. You may need to adjust cluster-level policies to allow the `KILL` capability (see the namespace example after this list).
5. Ensure your seccomp profile (if custom) allows the `kill` syscall; the `RuntimeDefault` profile typically includes it (see the `Localhost` profile sketch after this list).
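
For item 4 above: if the cluster enforces Pod Security Standards through the built-in Pod Security Admission controller, the `restricted` level only permits adding back `NET_BIND_SERVICE`, so a namespace whose pods need to add `KILL` has to run at the `baseline` level (or carry an exemption). A sketch, assuming a dedicated namespace whose name is illustrative:

```yaml
apiVersion: v1
kind: Namespace
metadata:
  name: local-ai # illustrative namespace name
  labels:
    # "baseline" allows adding the KILL capability; "restricted" does not
    pod-security.kubernetes.io/enforce: baseline
    pod-security.kubernetes.io/enforce-version: latest
```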
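
For item 5 above: when using a custom profile instead of `RuntimeDefault`, reference it with the `Localhost` seccomp type and make sure the profile JSON allows the `kill`, `tkill`, and `tgkill` syscalls. The file name below is hypothetical and is resolved relative to the kubelet's seccomp profile directory:

```yaml
# Container-level securityContext excerpt
securityContext:
  seccompProfile:
    type: Localhost
    localhostProfile: profiles/local-ai.json # hypothetical path; the JSON must allow kill/tkill/tgkill
```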