You may notice a bunch of messages popping up on the console, as shown below, when you start the Spark shell using pyspark.
How do you suppress these messages and show only errors?
[santhosh@localhost Downloads]$ pyspark
Python 2.7.5 (default, Sep 14 2016, 08:35:31)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
16/10/06 11:25:20 INFO spark.SparkContext: Running Spark version 1.6.0
16/10/06 11:25:21 INFO spark.SecurityManager: Changing view acls to: santhosh
16/10/06 11:25:21 INFO spark.SecurityManager: Changing modify acls to: santhosh
.
.
.
16/10/06 11:25:30 INFO storage.BlockManagerMaster: Registered BlockManager
16/10/06 11:25:30 INFO scheduler.EventLoggingListener: Logging events to hdfs://localhost:8020/user/spark/applicationHistory/application_1475694807177_0010
16/10/06 11:25:31 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/
Using Python version 2.7.5 (default, Sep 14 2016 08:35:31)
SparkContext available as sc, HiveContext available as sqlContext.
>>>
Solution:
Edit the file /etc/spark/conf/log4j.properties
and change the root logger level from INFO to ERROR:
root.logger=INFO,console --> root.logger=ERROR,console
Alternatively, if you only want to quiet the current session, you can call sc.setLogLevel("ERROR") from inside the pyspark shell (available since Spark 1.4) without editing any config file.
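The same edit can be scripted as a sed one-liner. The sketch below runs against a scratch file so it is safe to try anywhere; to apply it for real, point the command at /etc/spark/conf/log4j.properties (likely with sudo) after taking a backup.

```shell
# Demonstrate the log4j.properties change on a throwaway copy.
conf=$(mktemp)
echo 'root.logger=INFO,console' > "$conf"

# Switch the root logger from INFO to ERROR, exactly as described above.
sed -i 's/^root\.logger=INFO,console/root.logger=ERROR,console/' "$conf"

cat "$conf"    # now reads: root.logger=ERROR,console
```

Note that `sed -i` edits in place (GNU sed syntax, as on the Red Hat system shown in this post); use `sed -i.bak` if you want sed to keep a backup of the original file for you.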