How do I set up PySpark for Python 3 with spark-env.sh.template?


I have an issue in my IPython3 notebook, and I guess I have to change "spark-env.sh.template" somehow.

Exception: Python in worker has different version 2.7 than that in driver 3.4, PySpark cannot run with different minor versions

Spark does not yet work with Python 3 (but see the update below). If you wish to use the Python API you will also need a Python interpreter (version 2.6 or newer).

I had the same issue when running IPYTHON=1 ./pyspark.

OK, quick fix:

Edit the pyspark script (e.g. with vim) and change the PYSPARK_DRIVER_PYTHON="ipython" line to

PYSPARK_DRIVER_PYTHON="ipython2"

That's it.
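
By the way, if you only want to test the fix before touching the script, PYSPARK_DRIVER_PYTHON is read by the launcher, so you can override it for a single launch. A minimal sketch (ipython2 assumes a Python 2 IPython is on your PATH):

# Override the driver's Python/IPython for this one launch only,
# so it matches the Python 2.7 workers.
PYSPARK_DRIVER_PYTHON=ipython2 ./pyspark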

If you want to check what ipython points to,

type which ipython in your terminal, and I bet it'll be

/Library/Frameworks/Python.framework/Versions/3.4/bin/ipython
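
To go one step further and see which Python interpreter that ipython wrapper actually runs, its shebang line tells you. This is plain shell, nothing Spark-specific:

which ipython                # path of the ipython launcher script
head -1 "$(which ipython)"   # its shebang names the Python interpreter behind it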

**Updated**

The latest version of Spark works with Python 3 (and you may not even need the very latest version for this).

Just set the environment variable:

export PYSPARK_PYTHON=python3

If you want to make the change permanent, add that line to the pyspark script.
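
Since the question mentions spark-env.sh.template: another way to make it permanent is conf/spark-env.sh, which Spark sources on startup. A minimal sketch, assuming a standard layout under $SPARK_HOME and an IPython installed for Python 3 (ipython3):

# Create a real spark-env.sh from the shipped template,
# then pin the Python used on both sides so the versions can't diverge.
cd "$SPARK_HOME/conf"
cp spark-env.sh.template spark-env.sh

# Add these lines to spark-env.sh:
export PYSPARK_PYTHON=python3          # Python used by the workers/executors
export PYSPARK_DRIVER_PYTHON=ipython3  # Python/IPython used by the driver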

