Cannot change the log level in a Spark app

Code:

package SparkPkg01

import org.apache.spark.sql.SparkSession
import org.apache.log4j.{Level, Logger}

object sparkTest01 extends App {

  // Attempt 1: set the log4j level for Spark's loggers before the session starts
  Logger.getLogger("org").setLevel(Level.ERROR)

  val spark = SparkSession.builder
    .master("local[2]")
    .config("hive.metastore.uris", "thrift://localhost:9083")
    .enableHiveSupport()
    .appName("SparkTest")
    .getOrCreate()

  // Attempt 2: set the level through the SparkContext as well
  spark.sparkContext.setLogLevel("ERROR")

  println("Start work ...")

  spark.sql("select * from default.test_001").show()
  spark.stop()
}

Log:

"C:\Program Files\Java\jdk1.8.0_144\bin\java.exe" "-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2018.2.2\lib\idea_rt.jar=49272:D:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2018.2.2\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_144\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_144\jre\lib\deploy.jar;C:\Program ..." SparkPkg01.sparkTest01
Console output is saving to: C:\
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/devuser01/.ivy2/cache/ch.qos.logback/logback-classic/jars/logback-classic-1.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/devuser01/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
13:48:52.446 [main] INFO org.apache.spark.SparkContext - Running Spark version 2.1.0
13:48:52.534 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
13:48:52.546 [main] DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
13:48:52.547 [main] DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
13:48:52.665 [main] DEBUG org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty
13:48:52.667 [main] DEBUG org.apache.hadoop.security.Groups - Creating new Groups object
13:48:52.669 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
13:48:52.673 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader - Loaded the native-hadoop library
...

/**********************************************/
There is still too much log output (INFO and DEBUG messages), even though I set the level to ERROR twice. I also tried the suggestions from this question:

https://stackoverflow.com/questions/31951728/how-to-set-up-logging-level-for-spark-application-in-intellij-idea?noredirect=1&lq=1

but none of them work for me.
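Looking at the SLF4J warnings in the log again, the actual binding is logback-classic, not slf4j-log4j12, so I suspect my org.apache.log4j settings are simply ignored. If that is the cause, would a logback.xml on the classpath be the right fix? A sketch of what I would try (the file location and appender are just the logback defaults, not something already in my project):

```xml
<!-- src/main/resources/logback.xml -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- silence Spark/Hadoop internals, which all live under "org" -->
  <logger name="org" level="ERROR"/>

  <root level="ERROR">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```

Alternatively, I assume excluding ch.qos.logback:logback-classic from the dependencies would let the slf4j-log4j12 binding win, after which the Logger.getLogger("org").setLevel(Level.ERROR) call should take effect. Is that the correct approach?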

 


Please advise, thanks.