Error Summary
Author: Internet
20/12/12 15:49:47 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at apple.TFIDF$.main(TFIDF.scala:11)
    at apple.TFIDF.main(TFIDF.scala)
20/12/12 15:49:47 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at apple.TFIDF$.main(TFIDF.scala:11)
    at apple.TFIDF.main(TFIDF.scala)
// Original code that triggered the error:
import org.apache.spark.sql.SparkSession
val ss = SparkSession.builder().master("local").appName("hello").getOrCreate()
val sc = ss.sparkContext
sc.setLogLevel("ERROR")
The root cause: the driver JVM does not have enough heap for Spark to initialize a SparkContext. Here the JVM reports only 259522560 bytes (about 247 MB) of usable memory, while Spark requires at least 471859200 bytes (450 MB, i.e. 1.5 × the 300 MB that Spark reserves). This typically happens when running locally from an IDE whose run configuration uses a small default heap.
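For context, that 471859200-byte threshold comes from the startup check in UnifiedMemoryManager.getMaxMemory. A simplified sketch of the logic (not the exact Spark source, just its shape):

// Simplified sketch of the check in UnifiedMemoryManager.getMaxMemory
val reservedMemory  = 300L * 1024 * 1024             // 300 MB that Spark always reserves
val minSystemMemory = (reservedMemory * 1.5).toLong  // 471859200 bytes = 450 MB
// systemMemory is the current JVM heap, unless spark.testing.memory overrides it
val systemMemory    = Runtime.getRuntime.maxMemory
if (systemMemory < minSystemMemory) {
  throw new IllegalArgumentException(
    s"System memory $systemMemory must be at least $minSystemMemory. Please increase " +
      "heap size using the --driver-memory option or spark.driver.memory in Spark configuration.")
}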
For local testing, the simplest workaround is to set spark.testing.memory directly in the SparkConf in your code:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
// spark.testing.memory (in bytes) overrides the value Spark checks at startup
val sparkConf = new SparkConf().set("spark.testing.memory", "2147480000")
val ss = SparkSession.builder().config(sparkConf).master("local").appName("tfidf").getOrCreate()
val sc = ss.sparkContext
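Note that spark.testing.memory only raises the number Spark checks; it does not actually give the JVM more heap, so it is only appropriate for small local tests. The more general fix, as the error message itself suggests, is to enlarge the driver heap: when running from an IDE, add a VM option such as -Xmx1g to the run configuration; when launching with spark-submit, pass --driver-memory 1g (1g here is just an example value). Setting spark.driver.memory programmatically in the SparkConf does not help in local mode, because the driver JVM has already started by the time that setting is read.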
Tags: summary, SparkContext, error, scala, SparkSession, apache, org, spark
Source: https://www.cnblogs.com/ShyPeanut/p/14125001.html