Description
Code Sample, a copy-pastable example if possible
import pandas as pd

df = pd.read_csv('file.csv', index_col=0, parse_dates=True)

# Each iteration leaks memory under pandas 0.24.2.
while True:
    df['close'].rolling(4000).max()
Problem description
A memory leak eventually shuts down my application. It occurs in pandas 0.24.2 but not in pandas 0.23.4. Running the code above fills my 16 GB of memory within a few hours. file.csv is attached as a zip; the leak might only occur on certain data.
file.csv.zip
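To make the growth visible while reproducing, here is a minimal sketch, assuming psutil is installed (psutil is not part of this report's environment; it is only used to print the process's resident set size each batch of iterations):

import pandas as pd
import psutil

df = pd.read_csv('file.csv', index_col=0, parse_dates=True)
proc = psutil.Process()  # current process

for i in range(100_000):
    df['close'].rolling(4000).max()
    if i % 1000 == 0:
        # RSS should stay roughly flat; under 0.24.2 it keeps climbing.
        print(i, proc.memory_info().rss // (1024 * 1024), 'MiB')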
Expected Output
No memory leaks.
Output of pd.show_versions()
pandas: 0.24.2
pytest: None
pip: 19.0.3
setuptools: 38.4.0
Cython: None
numpy: 1.14.5
scipy: None
pyarrow: None
xarray: None
IPython: 7.1.1
sphinx: None
patsy: None
dateutil: 2.7.5
pytz: 2018.5
blosc: None
bottleneck: None
tables: None
numexpr: None
feather: None
matplotlib: None
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml.etree: None
bs4: 4.7.1
html5lib: None
sqlalchemy: 1.2.14
pymysql: None
psycopg2: 2.7.6.1 (dt dec pq3 ext lo64)
jinja2: 2.10
s3fs: None
fastparquet: None
pandas_gbq: None
pandas_datareader: None
gcsfs: None