[Spark] Installing Standalone Spark on CentOS 6.7

  1. System environment:
    Linux: CentOS 6.7
    JDK: jdk-1.8.0_131
    Scala: Scala 2.12.2
    Spark: Spark 2.1.1
  2. Install JDK
    Install JDK 8 and set the environment variables.

    [root@localhost local]# java -version
    java version "1.8.0_131"
    Java(TM) SE Runtime Environment (build 1.8.0_131-b11)
    Java HotSpot(TM) 64-Bit Server VM (build 25.131-b11, mixed mode)
    [root@localhost local]# echo $JAVA_HOME
    /usr/java/jdk1.8.0_131
    [root@localhost local]#
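
The transcript above assumes JAVA_HOME and PATH are already configured. A minimal sketch of doing so in /etc/profile, assuming the JDK was installed by the Oracle RPM under /usr/java/jdk1.8.0_131:

    [root@localhost local]# vi /etc/profile
    # append the following lines (path assumes the Oracle RPM layout)
    export JAVA_HOME=/usr/java/jdk1.8.0_131
    export PATH=$PATH:$JAVA_HOME/bin
    [root@localhost local]# source /etc/profile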

  3. Install Scala
    3.1 Download and extract

    [root@localhost ~]# wget https://downloads.lightbend.com/scala/2.12.2/scala-2.12.2.tgz
    [root@localhost ~]# cp scala-2.12.2.tgz /usr/local/
    [root@localhost ~]# cd /usr/local/
    [root@localhost local]# tar -zxvf scala-2.12.2.tgz
    [root@localhost local]# mv scala-2.12.2 scala
3.2 Configure environment variables

    [root@localhost local]# vi /etc/profile

Add the following:

    export SCALA_HOME=/usr/local/scala
    export PATH=$PATH:$SCALA_HOME/bin

Apply the configuration:

    [root@localhost local]# source /etc/profile

3.3 Verify the Scala installation

    [root@localhost local]# scala -version
    [root@localhost local]# scala
    Welcome to Scala 2.12.2 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131).
    Type in expressions for evaluation. Or try :help.
    scala>
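
As a quick sanity check, evaluating an expression in the REPL should print a result; the output below is illustrative of what Scala 2.12 shows:

    scala> 1 + 1
    res0: Int = 2
    scala> :quit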
  4. Install Spark
    4.1 Download and extract
    Spark download page: http://spark.apache.org/downloads.html

There are two options for the Spark package: download a prebuilt Spark, or build it from source.
If you want to build from scratch, download the source package. This is not recommended, however: the build tools (Maven or sbt) need network access during compilation, and the Maven repositories are blocked from within China, which makes the build very troublesome. After trying both Maven and sbt, I chose to download the prebuilt Spark.

After you select a Spark version, the Note section below it lists the range of Scala versions supported by that release; confirm that your Scala version meets this requirement before installing Spark.
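
The transcript below assumes the prebuilt archive is already in the home directory; a hedged example of fetching it, with the URL assumed from the Apache release archive layout:

    [root@localhost ~]# wget https://archive.apache.org/dist/spark/spark-2.1.1/spark-2.1.1-bin-hadoop2.7.tgz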

    [root@localhost ~]# cp spark-2.1.1-bin-hadoop2.7.tgz /usr/local/
    [root@localhost ~]# cd /usr/local/
    [root@localhost local]# tar -zxvf spark-2.1.1-bin-hadoop2.7.tgz
    [root@localhost local]# mv spark-2.1.1-bin-hadoop2.7 spark

4.2 Configure environment variables

    [root@localhost conf]# vi /etc/profile

Add the following:

    export SPARK_HOME=/usr/local/spark
    export PATH=$PATH:$SPARK_HOME/bin

Apply the configuration:

    [root@localhost conf]# source /etc/profile
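
A quick check that the variables took effect (the version banner printed by spark-submit is omitted here):

    [root@localhost conf]# echo $SPARK_HOME
    /usr/local/spark
    [root@localhost conf]# spark-submit --version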

4.3 Set the SPARK_EXAMPLES_JAR environment variable

    [root@localhost local]# vi ~/.bash_profile
    [root@localhost local]# vi /etc/profile

Add the following:

    export SPARK_EXAMPLES_JAR=$SPARK_HOME/examples/jars/spark-examples_2.11-2.1.1.jar

Apply the configuration:

    [root@localhost conf]# source /etc/profile
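
One way to confirm the examples jar is usable is to submit the bundled SparkPi example in local mode; the command below is a sketch (the final argument is the number of partitions, and the output should include an approximation of Pi):

    [root@localhost spark]# bin/spark-submit --class org.apache.spark.examples.SparkPi \
        --master local[2] $SPARK_EXAMPLES_JAR 10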

4.4 Modify the Spark configuration files
4.4.1 Configure spark.sh

    [root@localhost local]# vi /etc/profile.d/spark.sh

Add the following:

    export JAVA_HOME=/usr/java/jdk1.8.0_131
    export SCALA_HOME=/usr/local/scala

Apply the configuration:

    [root@localhost spark]# source /etc/profile.d/spark.sh

4.4.2 Configure spark-env.sh

    [root@localhost conf]# cp /usr/local/spark/conf/spark-env.sh.template /usr/local/spark/conf/spark-env.sh
    [root@localhost conf]# vi /usr/local/spark/conf/spark-env.sh
    export SCALA_HOME=/usr/local/scala
    export JAVA_HOME=/usr/java/jdk1.8.0_131
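
For a standalone cluster you may also want to pin the master host and worker resources in spark-env.sh. These settings are optional for a single-node setup and the values below are only illustrative:

    # optional standalone-mode settings (example values)
    export SPARK_MASTER_HOST=localhost
    export SPARK_WORKER_CORES=2
    export SPARK_WORKER_MEMORY=1g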

4.5 Start Spark

    [root@localhost spark]# sbin/start-all.sh
    starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
    localhost: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
    [root@localhost spark]#
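
To confirm the standalone cluster is up, check the Java processes with jps (the PIDs below are illustrative) and open the master web UI, which listens on port 8080 by default; the UI also shows the exact spark://host:7077 master URL that spark-shell can connect to:

    [root@localhost spark]# jps
    3072 Master
    3151 Worker
    3210 Jps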
