Labels
documentation (Improvements or additions to documentation), enhancement (New feature or request)
Description
Summary
Request to add a streaming parser API to links-notation for handling large messages efficiently.
Background
When building links-queue, we need to parse potentially large Links Notation messages for queue operations. A streaming parser would allow processing data as it arrives without loading the entire message into memory.
Use Case
```javascript
// Desired API for JavaScript
const parser = new StreamParser();

parser.on('link', (link) => {
  // Process each link as it's parsed
  console.log(link);
});

parser.on('error', (error) => {
  // Handle parse errors with location info
  console.error(`Error at line ${error.line}, col ${error.column}: ${error.message}`);
});

// Feed data incrementally
parser.write(chunk1);
parser.write(chunk2);
parser.end();
```

```rust
// Desired API for Rust
let mut parser = StreamParser::new();
parser.on_link(|link| {
    println!("{:?}", link);
});
parser.write(chunk1)?;
parser.write(chunk2)?;
let links = parser.finish()?;
```
Benefits
- Memory efficiency: Process large messages without loading everything into memory
- Latency: Start processing before the full message is received
- Network integration: Natural fit for TCP streaming in links-queue server mode
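To illustrate the core requirement, here is a minimal sketch of how a streaming parser could handle chunk boundaries. It assumes, purely for illustration, a line-oriented subset where each link ends at a newline; the `StreamParser` shown is a stand-in, not the real links-notation implementation.

```javascript
// Minimal sketch: buffer partial input across write() calls and emit
// each complete link exactly once. Assumes one link per line for brevity.
class StreamParser {
  constructor() {
    this.buffer = '';
    this.handlers = { link: [], error: [] };
  }
  on(event, handler) {
    this.handlers[event].push(handler);
  }
  emit(event, payload) {
    for (const h of this.handlers[event]) h(payload);
  }
  write(chunk) {
    this.buffer += chunk;
    let index;
    // Emit every complete line; keep the trailing partial line buffered.
    while ((index = this.buffer.indexOf('\n')) !== -1) {
      const line = this.buffer.slice(0, index);
      this.buffer = this.buffer.slice(index + 1);
      if (line.trim()) this.emit('link', line);
    }
  }
  end() {
    if (this.buffer.trim()) this.emit('link', this.buffer);
    this.buffer = '';
  }
}

// A link split across two chunks is still emitted exactly once:
const parser = new StreamParser();
const seen = [];
parser.on('link', (link) => seen.push(link));
parser.write('a: b\nc: ');
parser.write('d\n');
parser.end();
console.log(seen); // ['a: b', 'c: d']
```

The key design point is that only the unfinished tail of the input is retained between `write()` calls, so memory use is bounded by the size of one link rather than the whole message.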
Workaround
Currently, we would need to buffer the entire message before parsing, which is less efficient for large payloads.
Additional Feature Request
It would also be helpful to have detailed error reporting with line/column information for debugging parse errors.
References
- links-queue issue #16 - Links Notation integration