
BUG: Disable Numpy memory allocation while concat #59957

@sandeyshc

Description


Pandas version checks

  • I have checked that this issue has not already been reported.

  • I have confirmed this bug exists on the latest version of pandas.

  • I have confirmed this bug exists on the main branch of pandas.

Reproducible Example

import pandas as pd
# df_list is assumed here: mostly-null, pyarrow-backed frames (a stand-in, not our real data)
df_list = [pd.DataFrame({f"c{i}": pd.array([None] * 100_000, dtype="float64[pyarrow]")}) for i in range(50)]
pd.concat(df_list, axis=1)

Issue Description

We have sparse data with many null values. When we read it with pandas using the PyArrow backend, it consumes little memory, because the Arrow-backed representation stores the nulls compactly. During concatenation, however, NumPy allocates dense memory that is never actually used, and our Python script fails with a memory allocation error. Could you provide an option to disable this NumPy memory allocation when concatenating DataFrames along axis=1?
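
For illustration, here is a minimal sketch of how we observe this (the df_list construction, column names, and sizes are stand-ins for our real data):

import pandas as pd

# Stand-in inputs: many frames whose single column is pyarrow-backed and
# entirely null, so each frame on its own is cheap to hold in memory.
df_list = [
    pd.DataFrame({f"col{i}": pd.array([None] * 100_000, dtype="float64[pyarrow]")})
    for i in range(50)
]

before = sum(df.memory_usage(deep=True).sum() for df in df_list)
result = pd.concat(df_list, axis=1)  # the step where the allocation failure occurs
after = result.memory_usage(deep=True).sum()
print(f"inputs: {before} bytes, concatenated: {after} bytes")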

Expected Behavior

NumPy should not allocate memory that is never used: concatenating pyarrow-backed DataFrames along axis=1 should preserve their compact representation rather than materializing dense arrays.
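
In the meantime, a workaround along these lines seems to avoid the dense allocation by doing the horizontal combine at the Arrow level (a sketch only; it assumes equal-length frames with unique column names, and concat_axis1_via_arrow is a name made up here):

import pyarrow as pa
import pandas as pd

def concat_axis1_via_arrow(df_list):
    # Move each frame into an Arrow table, gather all columns side by side,
    # and come back to pandas with ArrowDtype so no dense NumPy arrays are built.
    tables = [pa.Table.from_pandas(df, preserve_index=False) for df in df_list]
    names = [name for t in tables for name in t.column_names]
    columns = [col for t in tables for col in t.columns]
    combined = pa.table(dict(zip(names, columns)))
    return combined.to_pandas(types_mapper=pd.ArrowDtype)

df_list = [pd.DataFrame({f"col{i}": pd.array([None] * 100_000, dtype="float64[pyarrow]")}) for i in range(50)]
result = concat_axis1_via_arrow(df_list)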

Installed Versions

pandas : 2.2.2
numpy : 2.1.0

Metadata

    Labels

    Bug, Needs Triage (issue that has not been reviewed by a pandas team member)
