
Do I have to install Scala to install Spark?



慕九州4526052
961 views · 1 answer

1 Answer

皈依舞

Install Spark:

tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz
mkdir /usr/local/spark
mv spark-1.3.0-bin-hadoop2.3 /usr/local/spark

Edit /etc/bashrc (vim /etc/bashrc) and add:

export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH

Then reload it:

source /etc/bashrc

Create spark-env.sh from its template and edit it:

cd /usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf/
cp spark-env.sh.template spark-env.sh
vim spark-env.sh

export JAVA_HOME=/java
export SCALA_HOME=/usr/lib/scala/scala-2.10.5
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export SPARK_MASTER_IP=192.168.137.101
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop
export SPARK_LIBRARY_PATH=$SPARK_HOME/lib
export SCALA_LIBRARY_PATH=$SPARK_LIBRARY_PATH

Create the slaves file from its template and list the worker hosts:

cp slaves.template slaves
vim slaves

hd1
hd2
hd3
hd4
hd5

7. Distribute to the other nodes:

scp /etc/bashrc hd2:/etc
scp /etc/bashrc hd3:/etc
scp /etc/bashrc hd4:/etc
scp /etc/bashrc hd5:/etc
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd2:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd3:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd4:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd5:/usr/local/spark/

8. Start the cluster. On hd1:

cd $SPARK_HOME/sbin
./start-all.sh
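As a quick sanity check after start-all.sh, you can attach a spark-shell (which is itself a Scala REPL, one reason the answer installs Scala 2.10.5, the series Spark 1.3.0 is built against) and run a tiny job. This is a minimal sketch, not part of the original answer; it assumes the standalone master URL spark://192.168.137.101:7077, which follows from the SPARK_MASTER_IP set above and the master's default port 7077:

$SPARK_HOME/bin/spark-shell --master spark://192.168.137.101:7077

// inside the REPL: distribute the numbers 1..1000 across the workers and sum them
scala> val rdd = sc.parallelize(1 to 1000)
scala> rdd.sum()   // should print 500500.0

If this returns the expected value, the master, the workers listed in slaves, and the environment settings in spark-env.sh are wired up correctly.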