Commit 421dcee

Update README.md (#4)
Some fixes were made with 0.2.0 but the readme wasn't updated. This fixes that.
1 parent 30e5317 commit 421dcee

File tree: 1 file changed (+5, −15 lines)

README.md

Lines changed: 5 additions & 15 deletions
```diff
@@ -13,19 +13,15 @@ Weave is a "multiversal" generative tree writing tool akin to [`loom`](https://g
 
 Notable features:
 
-- **Live switching of backends** - It's possible to generate part of a story
+- **Live switching of backends** - Generate part of a story
   with OpenAI and another part with LLaMA -- all without restarting the app.
-- **Streaming responses** - It's possible to cancel generations in progress --
+- **Streaming responses** - Cancel generations in progress --
   both local and online.
-- **Live editing** - It's possible to edit posts during generation, but not to
-  add or remove nodes, so you need not wait for generation to complete to tweak
-  the text to your liking. New tokens are always appended to the end.
+- **Live editing** - Edit posts during generation. New tokens are always appended to the end.
+- **Advanced sampling controls** - For local language models. Use any sampling methods in any order.
 
 Coming soon:
 
-- Fine-grained support over sampling for local models and potentially remote as
-  well for backends returning logprobs. The backend code is already written in
-  `drama_llama` but this is not exposed.
 - Keyboard shortcuts.
 
 Additionally, one goal of `weave` is feature parity with [`loom`](https://github.com/socketteer/loom?tab=readme-ov-file#features).
@@ -46,7 +42,7 @@ Additionally, one goal of `weave` is feature parity with [`loom`](https://github
 - 🔲 'Visited' state
 - ☑️ Generation
 - 🔲 Generate N children with various models (currently one a time).
-- ☑️ Modify generation settings (Complete for OpenAI but not yet from LLaMA)
+- Modify generation settings (Complete for OpenAI but not yet from LLaMA)
 - ☑️ File I/O
 - ✅ Serializable application state, including stories, to JSON.
 - ✅ Open/save trees as JSON files
@@ -69,9 +65,3 @@ Additionally, one goal of `weave` is feature parity with [`loom`](https://github
   nodes are implemented with [`egui::containers::Window`](https://docs.rs/egui/latest/egui/containers/struct.Window.html) which ignore scrollable areas. This is fixable
   but not easily and not cleanly. When it is resolved the central panel will be
   split into story and node views.
-- The `drama_llama` backend will crash if the model's output is not valid
-  unicode. This will be fixed. If this happens, go to settings, switch backends,
-  and then switch back `drama_llama`.
-- The BOS token is not added for the `drama_llama` backend. This will be added
-  as an option and enabled by default since most models expect it. Generation
-  will still work but the quality may be affected.
```
