JupyterHub remote execution: kernel reconnect?

I do a lot of app development with PyCharm and I am a big fan.

I used to do my modelling in JupyterHub, but the lack of proper source control and of PyCharm's code completion and help has always bothered me. Just recently I realized that I can connect PyCharm to the remote JupyterHub server, write the .ipynb files locally in PyCharm, and execute them on the JupyterHub server remotely. This is great, BUT I can't figure out a way to reconnect to a given kernel after disconnecting (PyCharm restart, PC restart, etc.). Let me provide an example.

Let's assume I have some long-running modelling (like some deep learning stuff) running in a JupyterHub kernel, started from PyCharm, and I shut down PyCharm. When I come back later, reopen PyCharm and reconnect, it seems I am not able to reattach to that running execution. Here is a minimal code example:

from time import sleep

print("starting")
sleep(30)  # long enough to close and reopen PyCharm while the cell is still running
print("ending")

When I execute the above, exit PyCharm during those 30 seconds while the code is "sleeping", and then reconnect, I never see the "ending" message in the Preview.

This is a real problem for deep learning runs: they log details that I would like to see once I reconnect, and they can run for hours, sometimes days, so it is practically impossible never to disconnect.
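
One workaround I can think of, until reconnecting is supported, is to also write the training progress to a log file on the remote server so that it survives a disconnect. A rough sketch using the standard logging module (the file name is just an example, anything on the remote filesystem would do):

import logging

# Write progress to a file on the remote server so it survives client disconnects.
logging.basicConfig(
    filename="training.log",          # example path on the remote machine
    level=logging.INFO,
    format="%(asctime)s %(message)s",
)

for epoch in range(100):
    # ... training step would go here ...
    logging.info("epoch %d finished", epoch)

That works, but it is clearly a crutch compared to just reconnecting to the kernel.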

Any idea how to get this working? Is this even supported by the Jupyter REST API? If so, could this use case be supported by PyCharm?
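
For what it's worth, the kernel itself seems to keep running on the server after the client goes away, and the Jupyter REST API can at least list it. A rough sketch that queries my single-user server for its running kernels (the hub URL, username and API token below are placeholders for my setup):

import requests

HUB = "https://jupyterhub.example.com"   # placeholder hub URL
USER = "myuser"                          # placeholder username
TOKEN = "..."                            # an API token valid for the single-user server

# The single-user server's REST API lives under /user/<username>/ on the hub.
resp = requests.get(
    f"{HUB}/user/{USER}/api/kernels",
    headers={"Authorization": f"token {TOKEN}"},
)
resp.raise_for_status()

for kernel in resp.json():
    # Each entry includes the kernel id, execution state and last activity,
    # which at least shows that the long-running kernel is still alive.
    print(kernel["id"], kernel["execution_state"], kernel["last_activity"])

What I can't find is a way to make PyCharm attach to one of those kernel ids and show the output produced while I was disconnected.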

2 comments

Hi,

You're right. I did some testing and I'm not able to connect to an existing kernel process after restarting PyCharm; the same is true for a plain Jupyter Notebook server.

I think this warrants a feature request at https://youtrack.jetbrains.com/issues

Hi,

Thank you for confirming.

I have raised a feature request at https://youtrack.jetbrains.com/issue/PY-43041
