Spark cluster installation

    I. Scala

    1. Download Scala 2.10.6: http://www.scala-lang.org/download/2.10.6.html

    2. Extract it: tar -zxvf /home/tools/scala-2.10.6.tgz -C /home/softwares/

    3. Configure the environment variables: vi /etc/profile

    export SCALA_HOME=/home/softwares/scala-2.10.6

    export PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin

    source /etc/profile

    Run scala to check that the installation succeeded.
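
    A quick, non-interactive way to confirm the install (a minimal sketch; the exact version banner depends on the build):

    # Print the installed Scala version; should report 2.10.6
    scala -version

    # Evaluate a one-line expression without opening the interactive REPL
    scala -e 'println("Scala OK: " + util.Properties.versionString)'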

    II. Spark

    1. Download Spark 1.6.0 (prebuilt for Hadoop 2.6):

    http://mirror.bit.edu.cn/apache/spark/spark-1.6.0/

    Extract it: tar -zxvf /home/tools/spark-1.6.0-bin-hadoop2.6.tgz -C /home/softwares/

    2. Environment variables (append to /etc/profile; a quick check follows the exports):

    export SCALA_HOME=/home/softwares/scala-2.10.6

    export SPARK_HOME=/home/softwares/spark-1.6.0-bin-hadoop2.6

    export PATH=$PATH:$JAVA_HOME/bin:$SCALA_HOME/bin:${SPARK_HOME}/bin:${SPARK_HOME}/sbin
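
    After saving /etc/profile, re-source it and sanity-check the variables; spark-submit --version printing the version banner confirms the Spark bin directory is on PATH (a quick check, assuming the paths above):

    source /etc/profile
    echo $SCALA_HOME
    echo $SPARK_HOME
    # Prints the Spark version banner if bin/ is on PATH
    spark-submit --version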

    3. conf/spark-env.sh (a path check for these settings is sketched after the list):

    export JAVA_HOME=/home/softwares/jdk1.7.0_79

    export SCALA_HOME=/home/softwares/scala-2.10.6

    export HADOOP_HOME=/home/softwares/hadoop-2.6.0

    export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop

    export SPARK_MASTER_IP=59.67.152.31

    export SPARK_WORKER_MEMORY=2g

    export MASTER=spark://59.67.152.31:7077

    export SPARK_EXECUTOR_MEMORY=2g

    export SPARK_DRIVER_MEMORY=2g

    export SPARK_WORKER_CORES=2
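
    Every path in spark-env.sh is taken literally at launch time, so it is worth confirming that each one exists before starting the cluster (a minimal check, assuming the directories used above; the same check should pass on every worker):

    # Each directory referenced in spark-env.sh must exist on every node
    for d in /home/softwares/jdk1.7.0_79 /home/softwares/scala-2.10.6 /home/softwares/hadoop-2.6.0; do
        [ -d "$d" ] || echo "missing: $d"
    done

    # The java binary the launch scripts will actually call
    /home/softwares/jdk1.7.0_79/bin/java -version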

    4. conf/slaves (add the worker nodes; the SSH setup they rely on is sketched after the list):

    Slave1.Hadoop

    Slave2.Hadoop
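
    start-all.sh logs into every host listed in conf/slaves over SSH, so the master needs passwordless SSH to each worker. If Hadoop's own start scripts already work from this master, this is likely in place; otherwise, a sketch assuming the hostnames above resolve and the root account is used as in the examples:

    # On the master: generate a key pair once (skip if one already exists)
    ssh-keygen -t rsa

    # Copy the public key to each worker listed in conf/slaves
    ssh-copy-id root@Slave1.Hadoop
    ssh-copy-id root@Slave2.Hadoop

    # Each of these should now run without a password prompt
    ssh root@Slave1.Hadoop hostname
    ssh root@Slave2.Hadoop hostname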

    5. conf/spark-defaults.conf (creating the event-log directory it references is shown after the settings):

    spark.master                    spark://59.67.152.31:7077

    spark.executor.extraJavaOptions   -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

    spark.eventLog.enabled           true

    spark.eventLog.dir               hdfs://59.67.152.31:8020/historyserverforSpark

    spark.yarn.historyServer.address   59.67.152.31:10020

    spark.history.fs.logDirectory       hdfs://59.67.152.31:8020/historyserverforSpark
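
    Spark expects the spark.eventLog.dir path to already exist in HDFS before applications write to it, otherwise jobs fail at startup. The directory name below simply follows the config above; the history server (sbin/start-history-server.sh, bundled with the Spark distribution) can then read the same directory via spark.history.fs.logDirectory:

    # Create the event-log directory referenced in spark-defaults.conf
    hdfs dfs -mkdir -p /historyserverforSpark

    # Optionally start the history server so finished applications stay visible
    sbin/start-history-server.sh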

    6. Once configured, copy the Spark directory to the other machines and repeat the environment-variable setup on each of them (a loop covering all the pieces is sketched below):

    scp -r /home/softwares/spark-1.6.0-bin-hadoop2.6/ root@59.67.152.33:/home/softwares/
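
    The Scala directory and the /etc/profile additions must be present on the workers as well. A loop over the hostnames from conf/slaves keeps the copies consistent (a sketch, assuming those hostnames resolve from the master; copying /etc/profile wholesale assumes the workers' profiles otherwise match the master's, else append the export lines by hand):

    for host in Slave1.Hadoop Slave2.Hadoop; do
        scp -r /home/softwares/scala-2.10.6/ root@$host:/home/softwares/
        scp -r /home/softwares/spark-1.6.0-bin-hadoop2.6/ root@$host:/home/softwares/
        scp /etc/profile root@$host:/etc/profile
    done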

    7. Start the cluster:

    Run sbin/start-all.sh to start the master and workers; bin/spark-shell then opens an interactive session against the cluster. A few checks to confirm the cluster is up are sketched below.
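
    A quick way to verify the cluster after start-all.sh (a sketch using the addresses above; the standalone master web UI listens on port 8080 by default, and SparkPi is one of the examples bundled with the distribution):

    # Master and Worker JVMs should show up in jps output on their respective hosts
    jps

    # The master web UI lists the registered workers
    curl -s http://59.67.152.31:8080 | grep -i worker

    # Submit a bundled example; with spark.master set in spark-defaults.conf it runs on the cluster
    bin/run-example SparkPi 10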


    III. Problems encountered: the following error came up during installation:

    [root@Master spark-1.6.0-bin-hadoop2.6]# sbin/start-all.sh
    starting org.apache.spark.deploy.master.Master, logging to /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
    failed to launch org.apache.spark.deploy.master.Master:
      /home/softwares/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86: /home/softwares/jdk1.7.0_79/bin/java: No such file or directory
    full log in /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
    Slave1.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
    Slave2.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
    Slave1.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
    Slave1.Hadoop:   /home/softwares/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86: /home/softwares/jdk1.7.0_79/bin/java: No such file or directory
    Slave2.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
    Slave2.Hadoop:   /home/softwares/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86: /home/softwares/jdk1.7.0_79/bin/java: No such file or directory
    Slave2.Hadoop: full log in /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
    Slave1.Hadoop: full log in /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
    [root@Master spark-1.6.0-bin-hadoop2.6]#

     

    The conf/spark-env.sh settings and the environment variables all looked correct, yet the error persisted, and I spent two days hunting for the cause. Eventually, opening the file again with vi spark-env.sh showed that it was not actually correct: it contained some stray, unexplained characters. After deleting those characters and restarting, everything worked.

    Likely cause: the installation steps had been copied and pasted from instructions found online, and some formatting (invisible characters) came along with the paste. So be careful and don't cut corners: type each step out yourself rather than rushing through.
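
    On most Linux systems such stray characters can be made visible from the shell; cat -A prints non-printing characters explicitly (for example ^M for Windows carriage returns), and dos2unix, if installed, strips carriage returns in place. A quick diagnostic sketch, not part of the original steps:

    # Show tabs, line endings, and other non-printing characters explicitly
    cat -A conf/spark-env.sh

    # If ^M appears at line ends, convert the file in place
    dos2unix conf/spark-env.sh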
