How do I set up PySpark with Python 3 using spark-env.sh.template?


I ask because I have an issue in the IPython3 notebook, and I guess I have to change "spark-env.sh.template" somehow.

Exception: Python in worker has different version 2.7 than that in driver 3.4, PySpark cannot run with different minor versions

Spark does not yet work with Python 3. If you wish to use the Python API, you will also need a Python interpreter (version 2.6 or newer).

I had the same issue when running IPYTHON=1 ./pyspark.

OK, here's a quick fix:

Edit the pyspark script (vim pyspark) and change the PYSPARK_DRIVER_PYTHON="ipython" line to:

PYSPARK_DRIVER_PYTHON="ipython2"

That's it.
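In newer Spark releases the pyspark launcher also respects PYSPARK_DRIVER_PYTHON set in the environment, so you may be able to get the same effect without editing the script at all. A rough sketch, assuming ipython2 is the name of the Python 2 IPython executable on your PATH:

export PYSPARK_DRIVER_PYTHON=ipython2   # make the driver use Python 2 IPython
./bin/pyspark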

If you want to check what ipython points to,

type which ipython in your terminal, and I bet it will be

/Library/Frameworks/Python.framework/Versions/3.4/bin/ipython
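If you want to double-check which interpreter each side will actually use, something like this works (the printed paths are just examples from a typical macOS install):

which ipython        # driver, when PYSPARK_DRIVER_PYTHON=ipython
ipython --version
which python         # workers, unless PYSPARK_PYTHON is set
python --version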

**Updated**

The latest version of Spark works with Python 3, so if you are on a recent release you may not need the fix above.

Just set the environment variable:

export PYSPARK_PYTHON=python3

If you want to make the change permanent, add that line to the pyspark script.
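Since the question asks about spark-env.sh.template: you can also make this permanent by copying the template to conf/spark-env.sh and exporting the variables there. A minimal sketch, assuming python3 and ipython3 are on your PATH:

# $SPARK_HOME/conf/spark-env.sh (copied from spark-env.sh.template)
export PYSPARK_PYTHON=python3            # interpreter the workers use
export PYSPARK_DRIVER_PYTHON=ipython3    # interpreter the driver/notebook uses

The driver and worker interpreters must be the same Python version, otherwise you will hit the same "different minor versions" exception as above.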

