
PyDarshan: Killed when creating Darshan report for very large darshan logs  #779

Description

@hammad45

I am using PyDarshan to read and parse Darshan DXT logs. The library works fine for small log files, but when I try to generate a Darshan report for very large log files, the process gets killed. I am using the following code to generate the report:

report = darshan.DarshanReport(self.args.darshan, read_all=True)

self.args.darshan contains arguments such as the Darshan log filename, start time, end time, etc.

After running the code, I just get a Killed message. I have also traced through the report.py code to see where the problem is. Execution stops in the mod_read_all_dxt_records function, probably because the file is too large to process in memory, and that is when the process is killed.
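For reference, this is the lower-memory variant I sketched while reading report.py. It is only a sketch: I am assuming that passing read_all=False defers record loading and that DXT records can then be read one module at a time via mod_read_all_dxt_records; log_path is a placeholder for the actual filename taken from self.args.darshan.

import darshan

log_path = "large_log.darshan"  # placeholder for the path from self.args.darshan

# Open the log without loading every record up front.
report = darshan.DarshanReport(log_path, read_all=False)

# Read DXT records one module at a time, only for modules present in the log.
for mod in ("DXT_POSIX", "DXT_MPIIO"):
    if mod in report.modules:
        report.mod_read_all_dxt_records(mod)

The idea is just to keep one module's DXT records in memory at a time instead of reading everything at once through read_all=True.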

I have also attached the log files for which I am getting this issue. The files can be found here. Please let me know if anything else is required.
