Add CSV stringify/serialize functionality #517
Status: Open
Labels: enhancement, help wanted, javascript
Description
The library currently focuses on parsing CSV, but lacks the inverse operation: converting JavaScript objects to CSV strings. Adding a stringify or serialize function would make this a complete CSV toolkit.
Proposed API
High-level API
```typescript
import { stringify } from 'web-csv-toolbox';

const records = [
  { name: 'Alice', age: '42', city: 'Tokyo' },
  { name: 'Bob', age: '69', city: 'Osaka' }
];

// Convert to CSV string
const csv = await stringify(records);
console.log(csv);
// Output:
// name,age,city
// Alice,42,Tokyo
// Bob,69,Osaka

// Or stream output
for await (const chunk of stringify.toStream(records)) {
  console.log(chunk);
}
```

Options
```typescript
interface StringifyOptions<
  Delimiter extends string = ',',
  Quotation extends string = '"'
> {
  delimiter?: Delimiter;
  quotation?: Quotation;
  headers?: string[] | false;      // false = no header row
  quote?: 'auto' | 'all' | 'none'; // when to quote fields
  lineEnding?: '\n' | '\r\n';
}
```

Implementation Approach
1. Leverage the existing escapeField function
- src/escapeField.ts already handles field escaping logic
- Use it to properly quote/escape fields
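To illustrate the escaping logic this step relies on, here is a minimal sketch of a field-escaping helper in the spirit of src/escapeField.ts. The function name matches, but the signature and defaults are assumptions, not the library's actual code.

```typescript
// Sketch of escapeField-style quoting (signature is an assumption).
function escapeField(
  field: string,
  delimiter: string = ",",
  quotation: string = '"',
): string {
  // Quote only when the field contains the delimiter, the quotation
  // character, or a line break ("auto" quoting).
  const needsQuoting =
    field.includes(delimiter) ||
    field.includes(quotation) ||
    /[\r\n]/.test(field);
  if (!needsQuoting) return field;
  // Double any embedded quotation characters, then wrap the field.
  const doubled = field.split(quotation).join(quotation + quotation);
  return quotation + doubled + quotation;
}
```

A plain value like `Tokyo` passes through untouched, while `a,b` becomes `"a,b"` and an embedded quote is doubled, matching the usual RFC 4180 convention.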
2. Create core stringify functions

```
src/
  stringify.ts          # High-level API
  stringifyToStream.ts  # Stream-based output
  stringifySync.ts      # Synchronous version
```
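As a rough starting point for the synchronous variant, here is a hedged sketch of what stringifySync could look like for arrays of objects. The option handling and header inference below are assumptions for illustration; a real implementation would also run each field through escapeField.

```typescript
// Hypothetical sketch of stringifySync, not the library's actual code.
interface StringifySyncOptions {
  delimiter?: string;
  lineEnding?: "\n" | "\r\n";
}

function stringifySync(
  records: Record<string, string>[],
  options: StringifySyncOptions = {},
): string {
  const { delimiter = ",", lineEnding = "\n" } = options;
  if (records.length === 0) return "";
  // Infer the header row from the first record's keys.
  const headers = Object.keys(records[0]);
  const lines = [headers.join(delimiter)];
  for (const record of records) {
    // Missing keys become empty fields; escaping is omitted here.
    lines.push(headers.map((h) => record[h] ?? "").join(delimiter));
  }
  return lines.join(lineEnding) + lineEnding;
}
```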
3. Handle different input types
- Array of objects (most common)
- Array of arrays
- AsyncIterableIterator of records (for streaming large datasets)
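The first two input shapes can be normalized before serialization, so the core writer only ever sees rows. The helper below is a sketch under that assumption; its name and return shape are illustrative, and the async-iterable case would get a similar streaming adapter.

```typescript
// Hypothetical input normalization: objects get inferred headers,
// arrays of arrays pass through without a header row.
function normalizeInput(
  input: Record<string, string>[] | string[][],
): { headers: string[] | null; rows: string[][] } {
  if (input.length === 0) return { headers: null, rows: [] };
  if (Array.isArray(input[0])) {
    // Array of arrays: caller controls the header row explicitly.
    return { headers: null, rows: input as string[][] };
  }
  const records = input as Record<string, string>[];
  const headers = Object.keys(records[0]);
  return {
    headers,
    rows: records.map((r) => headers.map((h) => r[h] ?? "")),
  };
}
```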
Use Cases
- Export data from web apps: Generate CSV downloads in browsers
- API responses: Convert query results to CSV format
- Data transformation pipelines: Parse → Transform → Stringify
- Testing: Generate test CSV data programmatically
Example: Browser Download
```typescript
const records = await fetchDataFromAPI();
const csv = await stringify(records);

const blob = new Blob([csv], { type: 'text/csv' });
const url = URL.createObjectURL(blob);

const a = document.createElement('a');
a.href = url;
a.download = 'export.csv';
a.click();
```

Example: Streaming Large Datasets
```typescript
async function* generateRecords() {
  for (let i = 0; i < 1000000; i++) {
    yield { id: i, value: Math.random() };
  }
}

// Memory-efficient: doesn't load all records at once
for await (const chunk of stringify.toStream(generateRecords())) {
  await writeToFile(chunk);
}
```

Challenges to Consider
- Type safety: Infer CSV structure from record types
- Performance: Efficient for large datasets
- Consistency: Mirror parsing options (delimiter, quotation)
- Edge cases: undefined/null values, special characters
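For the undefined/null edge case, one possible policy (an assumption, not a decided behavior) is to serialize both as empty fields and coerce everything else via String() before escaping:

```typescript
// Hypothetical coercion policy for non-string values.
function coerceField(value: unknown): string {
  // null and undefined become empty fields rather than "null"/"undefined".
  if (value === null || value === undefined) return "";
  return String(value);
}
```

Whatever policy is chosen, it should be documented and symmetric with how the parser treats empty fields.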
References
- Papa Parse unparse(): https://www.papaparse.com/docs#json-to-csv
- csv-stringify (Node.js): https://csv.js.org/stringify/
- Python csv.writer
Acceptance Criteria
- Implement stringify() function
- Support common options (delimiter, quotation, headers)
- Handle field escaping correctly (use escapeField)
- Streaming version for memory efficiency
- TypeScript type inference from input
- Comprehensive tests (including property-based)
- Documentation and examples
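To make the property-based testing criterion concrete, the core property is the round trip: parse(stringify(records)) should reproduce records. The self-contained sketch below demonstrates the idea with a naive stringify/parse pair and a tiny deterministic generator; it deliberately avoids quoting, so the generator only emits delimiter-free values. A real test suite would use a generator library such as fast-check and cover quoted fields, delimiters, and line breaks too.

```typescript
// Naive round-trip pair for demonstration only (no quoting support).
function naiveStringify(rows: string[][]): string {
  return rows.map((r) => r.join(",")).join("\n");
}
function naiveParse(csv: string): string[][] {
  return csv === "" ? [] : csv.split("\n").map((line) => line.split(","));
}

// Tiny deterministic pseudo-random generator (LCG) for the sketch.
function randomRows(seed: number): string[][] {
  let s = seed;
  const next = () => (s = (s * 1103515245 + 12345) % 2147483648);
  const cols = 1 + (next() % 4);
  const rows: string[][] = [];
  for (let i = 0; i < 5; i++) {
    rows.push(Array.from({ length: cols }, () => `v${next() % 1000}`));
  }
  return rows;
}

// Property: parse(stringify(rows)) === rows for every generated input.
for (let seed = 1; seed <= 50; seed++) {
  const rows = randomRows(seed);
  const roundTripped = naiveParse(naiveStringify(rows));
  if (JSON.stringify(roundTripped) !== JSON.stringify(rows)) {
    throw new Error(`round-trip failed for seed ${seed}`);
  }
}
```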
Enhancement: Major feature addition that completes the CSV toolbox!