Problem resolving Spark dependencies inferred from build.sbt

I'm using build 142.5439 and Scala plugin 1.9.1, although I already had this problem with 14.1 and plugin 1.5.4.

My build.sbt:

name := "BugRepro1"

version := "0.6.0-SNAPSHOT"

organization := "com.github.spirom"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.0" % "provided"

My only source file:

import org.slf4j.{Logger, LoggerFactory}

Well yes, there's more, but this is all you need, as neither Logger nor LoggerFactory can be resolved. It probably has something to do with the following warning:


8:10:03 AM SBT project import
           [warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
           [warn]  * org.scala-lang:scala-compiler:(2.10.0, 2.10.4)
           [warn]  * org.apache.commons:commons-lang3:(3.3.2, 3.0)
           [warn]  * org.slf4j:slf4j-api:(1.7.10, 1.7.2)

But it works in raw SBT, so I assume it's a bug in the Scala plugin? Any coping strategies? I struggle with problems like this every time I need to integrate with a new version of Spark while continuing to use Idea, but this is the first time I haven't been able to beat it into submission by drinking lots of coffee, deleting module dependencies, clearing the cache, and restarting a few times until I get lucky. (If people have general coping strategies for this sort of thing, I'm _really_ interested too.)

Thanks!

3 comments

Hi! I'm not quite following your problem with the Scala plugin. Could you please reformulate it using the following pattern:

I have a problem with (importing|compiling|testing|whatever) my project. It fails with the following error <error message here> when I <describe what you're trying to do with your project>. I expect it to (import|compile|test|whatever).

Thanks!


Apologies, Nikolay.

Background:

I maintain an SBT-based Scala project of moderate complexity, which depends on the Apache Spark JARs. Many months (and several Spark versions) ago, I created an Idea project from this build.sbt and checked it into GitHub with both the build.sbt and the Idea configuration files. I maintain the project in that form because I like using Idea for development, but it's also useful to be able to do raw sbt builds.

My problem:

Recently, I updated the project to depend on Apache Spark 1.5.0, simply by editing the two dependency entries in the build.sbt. I recompiled the project using sbt, fixed a couple of obvious compilation failures, and all was well.

Then I refreshed the build.sbt in Idea and recompiled the project in Idea. Almost everything was fine, except that Idea was unable to resolve the following import in one of the source files:

import org.slf4j.{Logger, LoggerFactory}

The messages I get are:

Object Logger is not a member of package org.slf4j

Object LoggerFactory is not a member of package org.slf4j

Reproducing my problem:

My original post provides a cut-down build.sbt and a single line of code that exhibit the problem. Simply create an sbt project from that build.sbt, add an empty Scala source file, and paste in my single import line. I claim it will fail to compile on both the latest 14.1 and the latest 15 EAP.
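For concreteness, a minimal source file for the repro might look like this (the object name and the logger usage are just illustrative; the bare import alone is enough to trigger the errors):

import org.slf4j.{Logger, LoggerFactory}

// Hypothetical minimal repro file; any reference to the slf4j types fails to resolve in Idea.
object BugRepro {
  val log: Logger = LoggerFactory.getLogger(getClass)
}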
 
What I tried:

  • Deleted all the generated dependencies in the generated Idea module and refreshed the build.sbt (this has helped in similar cases before but didn't help now)
  • Cleared my Ivy cache (didn't help, but it's a well-known technique)
  • Cleared the Idea cache and restarted (often helps with such problems but didn't this time)
  • Upgraded from 14.1 to the latest 15 EAP (didn't help -- I do like it better though -- congratulations!)
  • [Most recently:] Added a new SBT dependency libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.10" % "provided" -- this actually works (see the sketch just below), but shouldn't be necessary, because the Spark JARs already depend on it.
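For completeness, the dependency section of the cut-down build.sbt with that workaround applied looks roughly like this (same versions as above; only the last line is new):

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.0" % "provided"

// Workaround: declare slf4j-api directly, even though the Spark JARs already pull it in transitively.
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.10" % "provided"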

What I hoped would happen instead:

I don't know how technically feasible this is, but I would hope that whenever the project compiles successfully and runs using sbt, it would also compile successfully and run in Idea.

I was also hoping someone could suggest a workaround, but I just found one myself -- see above.

A general observation about Idea's support for build.sbt:

I've been having this kind of problem a LOT, with several projects I work on. I upgrade the version number of some dependency in build.sbt; raw sbt can immediately rebuild the project, but I have to struggle with Idea for hours or days before I can get it to compute the right dependencies and rebuild. Am I the only person having these problems?

What I think might be going wrong:

I did get a warning when refreshing the build.sbt:

8:10:03 AM SBT project import
           [warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
           [warn]  * org.scala-lang:scala-compiler:(2.10.0, 2.10.4)
           [warn]  * org.apache.commons:commons-lang3:(3.3.2, 3.0)
           [warn]  * org.slf4j:slf4j-api:(1.7.10, 1.7.2)

Notice that the last one is the JAR I'm trying to import from. Apparently Spark depends, transitively, on more than one version of that JAR. My suspicion is that while raw sbt is able to deal with this problem, Idea's support for sbt is not.
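A related thought, which I haven't actually tried: if the duplicate versions are what trips Idea up, forcing a single slf4j-api version via dependencyOverrides in build.sbt might also sidestep the problem without adding a direct dependency:

// Untested sketch: tell sbt/Ivy to resolve slf4j-api to exactly one version everywhere,
// so sbt and Idea should at least agree on which JAR ends up on the classpath.
dependencyOverrides += "org.slf4j" % "slf4j-api" % "1.7.10"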

Thanks and best regards!

Thanks, that's very detailed. The solution is to use the latest SBT, which is 0.13.9. To do so, add the line sbt.version=0.13.9 to the project/build.properties file in your project's directory and then update the project in IDEA.
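In other words, project/build.properties should end up containing:

# project/build.properties -- pins the sbt version used to build this project
sbt.version=0.13.9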

Apparently it was a bug in SBT: although slf4j-api 1.7.2 was evicted, it was still included in the `externalDependencyClasspath` list instead of the newer 1.7.10. Because IDEA uses that list to get module dependencies, it thought you were using 1.7.2, but there were no artifacts in the corresponding project library, because 1.7.2 had actually been evicted. I decided not to report this bug since it seems to be fixed in 0.13.9, but I need your response to be sure. Please try the steps I mentioned above and tell me if everything's fine now.
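If you want to double-check on your side, you can print that classpath and the eviction report from the sbt shell (the exact output will of course depend on your project); after the upgrade, the slf4j-api entry should be 1.7.10 rather than 1.7.2:

show compile:externalDependencyClasspath
evicted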
