I'm having an issue in IntelliJ when debugging a function that launches a MapReduce job via Hadoop's job.waitForCompletion(), with Hadoop configured to run in pseudo-distributed mode. The Hadoop job tracker receives the job, but quickly throws a ClassNotFoundException for my custom mapper class. It turns out the Hadoop client API isn't shipping the project jar to Hadoop: the JVM that IntelliJ launches doesn't have the (Maven-generated) project jar on its classpath.
Is there a way to configure the JVM process launched by IntelliJ so that the Maven-generated jar is on its classpath, and the Hadoop client will then pass that jar along to the job tracker when I submit the job?
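For reference, the driver looks roughly like this (class names, the job name, and the mapper are placeholders, not my actual code). My understanding is that setJarByClass() locates the jar the given class was loaded from; when IntelliJ runs the compiled classes straight out of target/classes instead of the jar, there is no jar to find, which would explain why nothing gets shipped to the job tracker:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyDriver {

    // Placeholder mapper; the real one lives in the Maven project.
    public static class MyMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "my-job");

        // setJarByClass() finds the jar containing this class on the
        // classpath; if the class was loaded from a directory (as when
        // IntelliJ runs target/classes), no jar is sent to the cluster.
        job.setJarByClass(MyMapper.class);
        job.setMapperClass(MyMapper.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // This is the call that fails with ClassNotFoundException
        // in pseudo-distributed mode.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```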
BTW: if Hadoop is configured to run in local (standalone) mode, everything runs just fine, in-process.