Description
Pandas version checks
- I have checked that this issue has not already been reported.
- I have confirmed this issue exists on the latest version of pandas.
- I have confirmed this issue exists on the main branch of pandas.
Reproducible Example
Hello,

I was doing some memory profiling of an application that uses, among other libraries, pandas. I noticed it was consuming more than 50 MB of memory from imports alone, so I dug in and found that this line

```python
import pandas._libs.pandas_parser
```

is the culprit.

Looking at the imported library files, they seem fairly small, so I wonder: what could be causing this memory blowup?

I have added some files for reproducibility; a minimal measurement sketch follows. Machine and Python details are below.
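For reference, here is a minimal sketch of how such a measurement could be taken, comparing process RSS before and after the import. It assumes psutil is available (psutil is not among the installed packages listed below) and should be run in a fresh interpreter so no pandas modules are already cached:

```python
import os

import psutil

proc = psutil.Process(os.getpid())
rss_before = proc.memory_info().rss  # resident set size before the import

import pandas._libs.pandas_parser  # noqa: E402  # the import under investigation

rss_after = proc.memory_info().rss
print(f"RSS increase from import: {(rss_after - rss_before) / 1024 ** 2:.1f} MB")
```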
Installed Versions
INSTALLED VERSIONS
commit : c888af6
python : 3.12.11
python-bits : 64
OS : Windows
OS-release : 11
Version : 10.0.26100
machine : AMD64
processor : Intel64 Family 6 Model 158 Stepping 10, GenuineIntel
byteorder : little
LC_ALL : None
LANG : en_US.UTF-8
LOCALE : English_United Kingdom.1252
pandas : 2.3.1
numpy : 2.0.2
pytz : 2025.2
dateutil : 2.9.0.post0
pip : None
Cython : 3.1.2
sphinx : None
IPython : 9.3.0
adbc-driver-postgresql: None
adbc-driver-sqlite : None
bs4 : None
blosc : None
bottleneck : 1.5.0
dataframe-api-compat : None
fastparquet : None
fsspec : 2025.5.1
html5lib : None
hypothesis : None
gcsfs : None
jinja2 : 3.1.6
lxml.etree : None
matplotlib : 3.10.3
numba : 0.61.2
numexpr : None
odfpy : None
openpyxl : None
pandas_gbq : None
psycopg2 : None
pymysql : None
pyarrow : 20.0.0
pyreadstat : None
pytest : 8.4.1
python-calamine : None
pyxlsb : None
s3fs : None
scipy : 1.16.0
sqlalchemy : 2.0.41
tables : None
tabulate : 0.9.0
xarray : None
xlrd : 2.0.2
xlsxwriter : None
zstandard : None
tzdata : 2025.2
qtpy : None
pyqt5 : None
Prior Performance
No response