Add streaming parser for large message handling #197

@konard

Description

Summary

Request to add a streaming parser API to links-notation for handling large messages efficiently.

Background

When building links-queue, we need to parse potentially large Links Notation messages for queue operations. A streaming parser would allow processing data as it arrives without loading the entire message into memory.

Use Case

```js
// Desired API for JavaScript
const parser = new StreamParser();

parser.on('link', (link) => {
  // Process each link as it's parsed
  console.log(link);
});

parser.on('error', (error) => {
  // Handle parse errors with location info
  console.error(`Error at line ${error.line}, col ${error.column}: ${error.message}`);
});

// Feed data incrementally
parser.write(chunk1);
parser.write(chunk2);
parser.end();
```

```rust
// Desired API for Rust
let mut parser = StreamParser::new();

parser.on_link(|link| {
    println!("{:?}", link);
});

parser.write(chunk1)?;
parser.write(chunk2)?;
let links = parser.finish()?;
```

Benefits

  1. Memory efficiency: Process large messages without loading everything into memory
  2. Latency: Start processing before the full message is received
  3. Network integration: Natural fit for TCP streaming in links-queue server mode

Workaround

Currently, we must buffer the entire message in memory before parsing can begin, which is inefficient for large payloads in both memory use and latency.
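For illustration, the current workaround can be sketched as below. This is a minimal sketch, not real links-notation code: `parseLinks` is a stand-in placeholder (here it just treats each non-empty line as one link), and `bufferAndParse` is a hypothetical helper name.

```js
// Stand-in for the real links-notation whole-message parser.
// Placeholder behavior: each non-empty line becomes one "link".
function parseLinks(text) {
  return text.split('\n').filter((line) => line.trim().length > 0);
}

// The workaround: every chunk must be held in memory and joined
// before parsing can begin -- exactly the cost a streaming parser
// would avoid.
function bufferAndParse(chunks) {
  const full = chunks.join('');
  return parseLinks(full);
}
```

Note that nothing is emitted until the last chunk arrives, so peak memory grows with message size and processing latency is bounded below by total transfer time.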

Additional Feature Request

It would also be helpful to have detailed error reporting with line/column information for debugging parse errors.
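A hedged sketch of what such an error object might look like; the class name and field names here are assumptions for discussion, not an existing links-notation API.

```js
// Hypothetical parse-error shape carrying location info.
class ParseError extends Error {
  constructor(message, line, column) {
    super(message);
    this.name = 'ParseError';
    this.line = line;     // 1-based line of the offending input
    this.column = column; // 1-based column within that line
  }
}
```

With a shape like this, the `error` event handler in the desired API above could format a message such as `Error at line 3, col 7: unexpected token` without any extra bookkeeping on the caller's side.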

Metadata
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request)
