
PySpark error "does not exist in the JVM" when initializing SparkContext


I am using Spark on EMR and wrote a PySpark script.
I get an error when trying to execute

from pyspark import SparkContext
sc = SparkContext()

Here is the error:

File "pyex.py", line 5, in <module>
    sc = SparkContext()
File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
File "/usr/local/lib/python3.4/site-packages/py4j/java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

I found this answer saying that I need to import SparkContext, but that does not work either.

Solution:

PySpark recently released 2.4.0, but there is no stable Spark release that coincides with this new version yet. Try downgrading to pyspark 2.3.2; that fixed it for me.
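The downgrade can be sketched with pip (assuming a pip-managed environment; the version number comes from the answer above, so adjust it to match your cluster's Spark distribution):

```shell
# Remove the mismatched PySpark package first, so no stale files remain
pip uninstall -y pyspark

# Pin the PySpark version that matches the Spark distribution on the cluster
pip install pyspark==2.3.2
```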

Edit: to be clearer, your PySpark version needs to be the same as the Apache Spark version that is installed, or you may run into compatibility issues.
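As a minimal sketch of that rule, the helper below compares two version strings on their major.minor components, which is the part that matters for PySpark/Spark compatibility here (the function name and the strictness of the check are illustrative assumptions, not an official API):

```python
def compatible(pyspark_version: str, spark_version: str) -> bool:
    """Return True when the major.minor components of both versions match.

    Hypothetical helper: PySpark and the Spark distribution should agree
    at least on major.minor (e.g. 2.3.x with 2.3.x) to avoid py4j errors
    like the one in the traceback above.
    """
    py_mm = pyspark_version.split(".")[:2]
    sp_mm = spark_version.split(".")[:2]
    return py_mm == sp_mm

print(compatible("2.4.0", "2.3.2"))  # → False (the mismatch from the question)
print(compatible("2.3.2", "2.3.2"))  # → True  (versions aligned after the downgrade)
```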

Check your pyspark version with:

pip freeze

Tags: amazon-emr, python, apache-spark, python-3-x, pyspark
Source: https://codeday.me/bug/20191010/1888334.html