Execute PySpark code from PyCharm IDE on remote server

Answered

I am new to PyCharm and am using the 2017.3.2 Professional version.

I am trying to execute code on a remote server. I want to write my PySpark code locally and then execute it on the remote Hadoop cluster installed on my VM.

I am able to upload code to the cluster by creating an SFTP configuration under the Deployment option, and I can see my file on the Hadoop cluster. But I am unable to execute the code on the cluster from the IDE.
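For context, what I am effectively trying to get the IDE to do is the equivalent of this rough sketch (the host, user, and script path are placeholders for my setup; it assumes key-based SSH auth and spark-submit on the cluster's PATH):

```python
import paramiko

# Placeholders: replace with your cluster's details
host, user = "hadoop-vm.local", "hduser"
remote_script = "/home/hduser/jobs/my_job.py"  # the file uploaded via SFTP

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host, username=user)  # assumes key-based auth is configured

# Run the uploaded script with spark-submit on the cluster
_, stdout, stderr = client.exec_command(f"spark-submit --master yarn {remote_script}")
print(stdout.read().decode())
print(stderr.read().decode())
client.close()
```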

Could someone please tell me if this is possible?

2 comments

Do we need the PyCharm Professional edition?
Also, what happens if I have all the XML config files from the Hadoop cluster on my local machine? I suppose I would then be able to connect and execute PySpark code against my remote Hadoop cluster, right?
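Something like this is what I have in mind (a rough sketch, assuming Spark is installed locally, the VM's YARN ports are reachable, and the copied core-site.xml / yarn-site.xml etc. sit in a local directory; the path below is a placeholder):

```python
import os
from pyspark.sql import SparkSession

# Placeholder: local directory holding core-site.xml, hdfs-site.xml,
# yarn-site.xml copied from the cluster
os.environ["HADOOP_CONF_DIR"] = "/home/me/hadoop-conf"

spark = (
    SparkSession.builder
    .master("yarn")                               # submit to the remote YARN cluster
    .config("spark.submit.deployMode", "client")  # driver runs on my local machine
    .appName("remote-cluster-test")
    .getOrCreate()
)

# Sanity check that work actually runs on the cluster
print(spark.range(100).count())
spark.stop()
```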

Many thanks

