PyCharm's Jupyter notebook freezes when displaying a large dataframe
When a dataframe displayed in PyCharm's Jupyter notebook is too large, the notebook gets stuck. For example:
A dataframe named df_data is about 200 MB in size. If I run df_data in a cell (not print(df_data)), then after a short wait all other cells end up stuck in a waiting state. This does not happen with df_data.head(10), which indicates that an oversized dataframe stalls PyCharm's notebook. I spent the whole morning tracking down this problem and updated all packages, but that did not solve it; in the end I found that the dataframe being too large was the cause. I hope this can be optimized.
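Until the freeze itself is fixed, the behavior described above suggests two practical workarounds: display only a slice of the frame, or cap how many rows pandas renders. A minimal sketch, assuming a pandas DataFrame standing in for the reported df_data (the sample data here is hypothetical):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the ~200 MB df_data from the report.
df_data = pd.DataFrame(np.random.rand(1000, 10))

# Workaround 1: render only the first rows instead of the whole frame.
preview = df_data.head(10)  # evaluating `df_data` alone is what triggers the freeze

# Workaround 2: cap how many rows pandas serializes for display output.
pd.set_option("display.max_rows", 20)
```

Neither option changes the data itself; both only reduce how much output the notebook front end has to render.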
Unfortunately, this is a known problem. Please join the discussion in the YouTrack thread: https://youtrack.jetbrains.com/issue/PY-65536/PyCharm-freezes-heavily-when-running-a-Jupyter-Notebook-cell-with-a-relatively-big-dataframe