-
Streaming the records from one file to another will use the smallest amount of memory. Here is an example.

    using System.Globalization;
    using System.IO;
    using CsvHelper;
    using CsvHelper.Configuration;

    void Main()
    {
        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
        };

        using var readStream = File.OpenRead("");
        using var reader = new StreamReader(readStream);
        using var csvReader = new CsvReader(reader, config);

        using var writeStream = File.Open("", FileMode.Create);
        using var writer = new StreamWriter(writeStream);
        using var csvWriter = new CsvWriter(writer, config);

        // Write the header row for the output type.
        csvWriter.WriteHeader<Bar>();
        csvWriter.NextRecord();

        foreach (var foo in csvReader.GetRecords<Foo>())
        {
            // Convert from type Foo to type Bar.
            var bar = new Bar
            {
                Id = foo.Id,
                Name = foo.Name,
            };

            csvWriter.WriteRecord(bar);
            // WriteRecord does not advance to the next row on its own.
            csvWriter.NextRecord();
        }
    }

    private class Foo
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    private class Bar
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
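On the record-by-record vs. GetRecords question: GetRecords<T>() is lazily evaluated, so both approaches keep only one record in memory at a time. If you want explicit control over each row, here is a minimal sketch of the manual pattern using CsvHelper's Read, ReadHeader, and GetRecord<T> — the file name "input.csv", the sample data, and the Foo class are illustrative placeholders, and the sketch writes its own tiny input so it runs on its own.

```csharp
using System;
using System.Globalization;
using System.IO;
using CsvHelper;

// Create a tiny sample input so the sketch is runnable on its own.
File.WriteAllText("input.csv", "Id,Name\n1,alpha\n2,beta\n");

using var reader = new StreamReader("input.csv");
using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);

csv.Read();        // advance to the header row
csv.ReadHeader();  // register the header so fields map by name

var count = 0;
while (csv.Read()) // one record in memory at a time
{
    var foo = csv.GetRecord<Foo>();
    count++;
    // transform and write foo here, as in the example above
}
Console.WriteLine($"Read {count} records");

class Foo
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```

The main practical difference is ergonomics, not memory: the while loop lets you inspect raw fields (csv.GetField) or skip malformed rows before materializing an object, while GetRecords keeps the code shorter.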
-
I'm working on converting large incoming TSV files (usually around 1.5 GB): parsing the records, doing some conversions, and then writing them out in another format. Records and class maps work fine. CsvHelper is a joy to work with! I will write a console application using .NET 6.
Before I start I am wondering if there are any best-practices people here like to share for doing this type of work?
Should the process go record by record, or should it use GetRecords?
I have read a lot about streams, memory-mapped files, etc., and I understand there are many options for doing this. So I am asking in the context of CsvHelper: what has worked and what hasn't worked so well with large files?
I really appreciate any pointers to good samples, suggestions, or advice :-)