I have to delete columns that used more than "k" bytes of memory. Is there any Python function to find the memory usage of each column of a Pandas DataFrame?
Pandas provides the DataFrame.memory_usage() method, which returns a Series giving the memory usage of each column in bytes (pass deep=True to get accurate byte counts for object columns, such as strings).
Here is an example:
>>> import numpy as np
>>> import pandas as pd
>>> df = pd.DataFrame({'a': np.random.random(3), 'b': np.random.random(3), 'c': np.random.random(3)})
>>> df
          a         b         c
0  0.092695  0.744438  0.453155
1  0.967226  0.786605  0.202768
2  0.644772  0.589621  0.045076
>>> m = df.memory_usage()
>>> m
Index    128
a         24
b         24
c         24
dtype: int64
>>> m['a']
24
>>> m['b']
24
>>> m['c']
24
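To actually drop the columns that use more than k bytes, you can filter the Series returned by memory_usage() and pass the offending column labels to DataFrame.drop(). A minimal sketch (the threshold k = 100 and the sample data are just placeholders):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({'a': np.random.random(3),
                   'b': np.random.random(3),
                   'c': np.random.random(3)})

k = 100  # example byte threshold; substitute your own limit

# Per-column usage in bytes; drop the 'Index' entry so only
# real columns remain. deep=True measures object columns accurately.
usage = df.memory_usage(deep=True).drop('Index')

# Remove every column whose usage exceeds k bytes.
df_small = df.drop(columns=usage[usage > k].index)
```

Here each float64 column holds 3 values of 8 bytes, i.e. 24 bytes, so with k = 100 no column is removed; lowering k below 24 would drop them all.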