I have something like this:

```rust
#[derive(BinRead)]
enum Op {
    #[br(magic = b"O")]
    One,
    #[br(magic = b"T")]
    Two,
}
```

When decoding, if there aren't enough bytes to match the magic, I get a `BadMagic` error rather than an EOF:

```rust
Op::read_be(&mut std::io::Cursor::new(b"")) // this will be a BadMagic error, not an EOF
```

Interestingly, if I make the enum open-ended, then I can get an EOF:

```rust
#[derive(BinRead)]
enum Op {
    #[br(magic = b"O")]
    One,
    #[br(magic = b"T")]
    Two,
    Other(u8),
}
```

Presumably, this is because the empty byte array fails to match each variant's magic in turn, and then fails because there aren't enough bytes to fill the `u8` in `Other`.

The magics all have the same size; is it possible to specify how many bytes to expect before trying to match the magics?
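For illustration, here is a minimal sketch of the kind of behaviour being asked about, written as a hand-rolled parse function instead of the derive. The helper `read_op` is hypothetical and not part of binrw; it assumes binrw's `BinRead` trait and its `Error::NoVariantMatch` variant. The idea is to read the fixed-size one-byte magic first, so an empty input surfaces as an I/O EOF error, and only then match it against the known magics:

```rust
use binrw::{BinRead, BinResult, Error};
use std::io::{Cursor, Read, Seek};

#[derive(Debug, PartialEq)]
enum Op {
    One,
    Two,
}

// Hypothetical helper, not binrw API: read the one-byte magic up front so an
// empty stream fails with an I/O EOF error instead of a magic/variant error,
// then match the byte against the known magics.
fn read_op<R: Read + Seek>(reader: &mut R) -> BinResult<Op> {
    let pos = reader.stream_position()?;
    let tag = u8::read_be(reader)?; // propagates EOF from the underlying reader
    match tag {
        b'O' => Ok(Op::One),
        b'T' => Ok(Op::Two),
        _ => Err(Error::NoVariantMatch { pos }),
    }
}

fn main() {
    // Empty input: the error now comes from the failed u8 read (EOF), not a magic mismatch.
    assert!(read_op(&mut Cursor::new(b"")).is_err());
    assert_eq!(read_op(&mut Cursor::new(b"T")).unwrap(), Op::Two);
}
```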
Hi, thanks for your question!
The problem of returning EOF for variant parsing is complicated because it is an ambiguous failure condition. #217 (comment) describes the situation as it applies to data enums with `pre_assert`, and similar issues exist here, since `magic` is allowed to have heterogeneous types: maybe the author wants an EOF error when *any* magic would hit EOF, or maybe only when the *shortest* magic would. Your idea of hinting at the expected size is on the right track in this regard, but whatever the solution is, it should probably be consistent across all enum parsing, so it needs to consider the situation from #217 too.
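As a sketch of that ambiguity (a hypothetical enum, not from this thread): with magics of different widths, a two-byte input reaches EOF while trying the four-byte magic but not the two-byte one, so there is no single obvious answer for whether the overall result should be EOF or a variant-match failure.

```rust
use binrw::BinRead;

// Hypothetical enum, only to illustrate the ambiguity: the magics have
// different widths, so a short input hits EOF only while trying `Long`.
#[derive(BinRead)]
enum Mixed {
    #[br(magic = b"AB")]
    Short,
    #[br(magic = b"ABCD")]
    Long,
}

fn main() {
    // Two bytes that match neither magic: `Short` fails with a magic mismatch,
    // while `Long` fails with EOF -- which error should the enum report?
    let _ = Mixed::read_be(&mut std::io::Cursor::new(b"XY"));
}
```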
It does seem undesirable to me that we are returning `NoVariantMatch` when a…