慕用0436605
2018-12-13 14:51
[hadoop@localhost spark-2.1.0]$ ./dev/make-distribution.sh –name 2.6.0-cdh5.7.0 –tgz -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.7.0 -Phive -Phive-thriftserver
+++ dirname ./dev/make-distribution.sh
++ cd ./dev/..
++ pwd
+ SPARK_HOME=/home/hadoop/source/spark-2.1.0
+ DISTDIR=/home/hadoop/source/spark-2.1.0/dist
+ MAKE_TGZ=false
+ MAKE_PIP=false
+ MAKE_R=false
+ NAME=none
+ MVN=/home/hadoop/source/spark-2.1.0/build/mvn
+ (( 8 ))
+ case $1 in
+ break
+ '[' -z /home/hadoop/app/jdk1.8.0_191 ']'
+ '[' -z /home/hadoop/app/jdk1.8.0_191 ']'
++ command -v git
+ '[' ']'
++ command -v /home/hadoop/source/spark-2.1.0/build/mvn
+ '[' '!' /home/hadoop/source/spark-2.1.0/build/mvn ']'
++ tail -n 1
++ grep -v INFO
++ /home/hadoop/source/spark-2.1.0/build/mvn help:evaluate -Dexpression=project.version $'\342\200\223name' 2.6.0-cdh5.7.0 $'\342\200\223tgz' -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.7.0 -Phive -Phive-thriftserver
+ VERSION=
[hadoop@localhost spark-2.1.0]$
It just dropped back to the shell prompt automatically. Why is that?
Try adding -V and check the log output.
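A likely cause is already visible in the trace above: the mvn line shows $'\342\200\223name' and $'\342\200\223tgz', and \342\200\223 is the UTF-8 byte sequence for an en dash (U+2013, "–"). In other words, the --name and --tgz options were pasted with en dashes instead of two ASCII hyphens, so make-distribution.sh does not recognize them, passes them straight through to mvn help:evaluate, the version lookup fails, VERSION is left empty, and the script exits right after that. A minimal sketch of the same command retyped with plain ASCII double hyphens (same paths and profiles as in the log above):

# note: "--" must be two ASCII hyphens, not an en dash "–"
[hadoop@localhost spark-2.1.0]$ ./dev/make-distribution.sh --name 2.6.0-cdh5.7.0 --tgz \
    -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.7.0 \
    -Phive -Phive-thriftserver

If the flags are parsed correctly, the trace should show NAME=2.6.0-cdh5.7.0 and MAKE_TGZ=true instead of NAME=none and MAKE_TGZ=false before the mvn version check runs.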