- Spark Cookbook
- Rishi Yadav
Installing Spark from binaries
Spark can either be built from source, or precompiled binaries can be downloaded from http://spark.apache.org. For a standard use case, binaries are good enough, and this recipe focuses on installing Spark using binaries.
Getting ready
All the recipes in this book are developed using Ubuntu Linux but should work fine in any POSIX environment. Spark expects Java to be installed and the JAVA_HOME environment variable to be set.
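Both prerequisites can be verified from the shell before proceeding; a minimal sketch (the `check_java_env` function name is ours, not part of the recipe):

```shell
# Verify the two prerequisites: java on PATH and JAVA_HOME set.
check_java_env() {
    command -v java >/dev/null 2>&1 || { echo "java not found on PATH"; return 1; }
    [ -n "${JAVA_HOME:-}" ]         || { echo "JAVA_HOME is not set"; return 1; }
    echo "ok"
}
```

Run `check_java_env` in a terminal; it prints `ok` when both checks pass.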
In Linux/Unix systems, there are certain standards for the location of files and directories, which we are going to follow in this book. The following is a quick cheat sheet of the locations this recipe uses:
- /etc: host-specific system configuration
- /opt: add-on software packages
- /var: variable data, such as logs
- /tmp: temporary files
How to do it...
At the time of writing this, Spark's current version is 1.4. Please check the latest version on Spark's download page at http://spark.apache.org/downloads.html. Binaries are built against the most recent stable version of Hadoop. To use a specific version of Hadoop, the recommended approach is to build from source, which will be covered in the next recipe.
The following are the installation steps:
- Open the terminal and download binaries using the following command:
$ wget http://d3kbcqa49mib13.cloudfront.net/spark-1.4.0-bin-hadoop2.4.tgz
- Unpack binaries:
$ tar -zxf spark-1.4.0-bin-hadoop2.4.tgz
- Rename the folder containing binaries by stripping the version information:
$ sudo mv spark-1.4.0-bin-hadoop2.4 spark
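The version-free name can also be derived from the tarball name instead of typing it twice; a small sketch (the `strip_version` helper is ours, not part of the recipe):

```shell
# Given a Spark tarball name, compute the version-free directory name,
# e.g. spark-1.4.0-bin-hadoop2.4.tgz -> spark.
strip_version() {
    name="${1%.tgz}"     # drop the .tgz extension
    echo "${name%%-*}"   # keep everything before the first '-'
}
```

Usage: with `src=spark-1.4.0-bin-hadoop2.4.tgz`, the rename becomes `sudo mv "${src%.tgz}" "$(strip_version "$src")"`, so updating the version number in one place updates the whole step.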
- Move the configuration folder to the /etc folder so that it can be made a symbolic link later (create /etc/spark first, as mv fails when the target directory does not exist):
$ sudo mkdir -p /etc/spark
$ sudo mv spark/conf/* /etc/spark
- Create your company-specific installation directory under /opt. As the recipes in this book are tested on the infoobjects sandbox, we are going to use infoobjects as the directory name. Create the /opt/infoobjects directory:
$ sudo mkdir -p /opt/infoobjects
- Move the spark directory to /opt/infoobjects as it's an add-on software package:
$ sudo mv spark /opt/infoobjects/
- Change the ownership of the spark home directory to root:
$ sudo chown -R root:root /opt/infoobjects/spark
- Change the permissions of the spark home directory to 0755 (user: read-write-execute, group: read-execute, world: read-execute):
$ sudo chmod -R 755 /opt/infoobjects/spark
- Move to the spark home directory:
$ cd /opt/infoobjects/spark
- Create the symbolic link:
$ sudo ln -s /etc/spark conf
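A symlink can be verified right after it is created; a sketch wrapping the two operations, with illustrative paths rather than the recipe's real ones (the `link_conf` name is ours):

```shell
# Link a conf directory into a Spark home and report what the link resolves to.
link_conf() {
    conf_target="$1"    # e.g. /etc/spark
    spark_home="$2"     # e.g. /opt/infoobjects/spark
    ln -s "$conf_target" "$spark_home/conf"
    readlink "$spark_home/conf"
}
```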
- Append the Spark binaries path to PATH in .bashrc:
$ echo "export PATH=$PATH:/opt/infoobjects/spark/bin" >> /home/hduser/.bashrc
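Note that double quotes make the shell expand $PATH at write time, freezing the current value into .bashrc. An alternative sketch that keeps the literal `$PATH` so expansion happens at login, and skips the append when the entry is already present (the `add_to_path` name is ours):

```shell
# Append a directory to PATH in an rc file, writing a literal $PATH
# and skipping the append when the directory is already listed.
add_to_path() {
    dir="$1"; rcfile="$2"
    grep -qs "$dir" "$rcfile" || \
        printf 'export PATH=$PATH:%s\n' "$dir" >> "$rcfile"
}
```

Running it twice with the same arguments leaves a single entry in the rc file.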
- Open a new terminal.
- Create the log directory in /var:
$ sudo mkdir -p /var/log/spark
- Make hduser the owner of the Spark log directory:
$ sudo chown -R hduser:hduser /var/log/spark
- Create the Spark tmp directory:
$ mkdir /tmp/spark
- Configure Spark with the help of the following command lines:
$ cd /etc/spark
$ echo "export HADOOP_CONF_DIR=/opt/infoobjects/hadoop/etc/hadoop" >> spark-env.sh
$ echo "export YARN_CONF_DIR=/opt/infoobjects/hadoop/etc/hadoop" >> spark-env.sh
$ echo "export SPARK_LOG_DIR=/var/log/spark" >> spark-env.sh
$ echo "export SPARK_WORKER_DIR=/tmp/spark" >> spark-env.sh
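The four echo lines above can also be written as one heredoc append; a sketch assuming the same /opt/infoobjects layout (the `write_spark_env` name is ours):

```shell
# Append the recipe's four Spark environment settings in one heredoc.
write_spark_env() {
    conf_dir="$1"    # e.g. /etc/spark
    cat >> "$conf_dir/spark-env.sh" <<'EOF'
export HADOOP_CONF_DIR=/opt/infoobjects/hadoop/etc/hadoop
export YARN_CONF_DIR=/opt/infoobjects/hadoop/etc/hadoop
export SPARK_LOG_DIR=/var/log/spark
export SPARK_WORKER_DIR=/tmp/spark
EOF
}
```

The quoted 'EOF' delimiter prevents variable expansion, so the export lines land in spark-env.sh verbatim.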