Commit ad5d4ca

Merge pull request #280 from akawashiro/misc-fixes: Stop -it to run by systemd
2 parents 7215837 + b944963

File tree: 3 files changed (+43, −1 lines)

README.md
Lines changed: 21 additions & 0 deletions

@@ -171,6 +171,27 @@ For example,
 $ jendeley launch --db <YOUR PDFs DIR>/jendeley_db.json --experimental_use_ollama_server
 ```
 
+To run the LLM server automatically, you can use the following `systemd` service file.
+```console
+$ cat ~/.config/systemd/user/ollama-jendeley.service
+# jendeley.service
+[Unit]
+Description=jendeley JSON-based document organization software
+
+[Service]
+ExecStart=<PATH_TO_NODE>/node/v18.16.0/lib/node_modules/@a_kawashiro/jendeley/run_ollama.sh
+
+[Install]
+WantedBy=default.target
+$ systemctl --user enable ollama-jendeley
+$ systemctl --user start ollama-jendeley
+```
+
+To check the LLM server's status, you can use the following command.
+```console
+$ journalctl --user -f -u ollama-jendeley.service
+```
+
 ## Contact me
 You can find me on Twitter at [https://twitter.com/a_kawashiro](https://twitter.com/a_kawashiro) and on Mastodon at [https://mstdn.jp/@a_kawashiro](https://mstdn.jp/@a_kawashiro). Additional contact information can be found on my website at [https://akawashiro.github.io/#links](https://akawashiro.github.io/#links). Also, feel free to create an issue or submit a pull request on [the repository](https://github.com/akawashiro/jendeley).
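The unit file added in this diff is minimal: if `run_ollama.sh` exits, nothing restarts it. A possible hardened variant is sketched below; the `After=`, `Restart=`, and `RestartSec=` settings are assumptions for illustration, not part of the committed unit, and `<PATH_TO_NODE>` stays a placeholder as in the original.

```ini
# ~/.config/systemd/user/ollama-jendeley.service — sketch, not the committed file
[Unit]
Description=jendeley JSON-based document organization software
# Assumption: wait for the network before pulling models.
After=network-online.target

[Service]
ExecStart=<PATH_TO_NODE>/node/v18.16.0/lib/node_modules/@a_kawashiro/jendeley/run_ollama.sh
# Assumption: restart the server if the script fails.
Restart=on-failure
RestartSec=5

[Install]
WantedBy=default.target
```

After editing a user unit, run `systemctl --user daemon-reload` so systemd picks up the change before `enable`/`start`.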
jendeley-backend/README.md
Lines changed: 21 additions & 0 deletions

@@ -171,6 +171,27 @@ For example,
(hunk identical to the README.md change above)

jendeley-backend/run_ollama.sh
Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ docker run \
   -v ${ollama_dir}:/root/.ollama \
   ollama/ollama
 sleep 2
-docker exec -it ${ollama_container_name} bash -c "ollama pull llama3.2"
+docker exec ${ollama_container_name} bash -c "ollama pull llama3.2"
 curl http://localhost:11434/api/generate -d '{
   "model": "llama3.2",
   "prompt": "What is 1 + 1?",
