Installation of Spark Shell

Hi viewers,

This article walks you through a single-node Spark installation. Follow the steps carefully to get Spark installed on your own machine and start working with Spark.


1. Download Spark from the link below. If you are on a Linux OS, just use the wget command below to fetch this Spark version. If you need the latest version, refer to the official Spark site.
$ wget http://d3kbcqa49mib13.cloudfront.net/spark-1.3.0.tgz

2. Spark needs Scala to run, so download Scala using the wget command below and wait until it finishes downloading. If you need the latest version of Scala, visit the official Scala site.
$ wget http://www.scala-lang.org/files/archive/scala-2.10.4.tgz
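
Before untarring, it is worth confirming that both archives downloaded completely, since a truncated download is the usual cause of tar errors in the next step. A quick integrity check (file names follow from the wget commands above):

$ tar -tzf spark-1.3.0.tgz > /dev/null && echo "spark archive OK"
$ tar -tzf scala-2.10.4.tgz > /dev/null && echo "scala archive OK"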

3. Once spark-1.3.0.tgz and scala-2.10.4.tgz are on your machine, untar both files.

$ tar -zxvf spark-1.3.0.tgz
$ tar -zxvf scala-2.10.4.tgz
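
Extraction should leave two new directories next to the tarballs; a quick listing confirms it (directory names follow from the tarball names above):

$ ls -d spark-1.3.0 scala-2.10.4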

After untarring Spark and Scala, set the Scala and Spark home directories in your .bashrc as shown below. After editing the .bashrc file, source it so the changes take effect in the current shell.
$ vi .bashrc
export SCALA_HOME=/home/bigdata/scala-2.10.4
export SPARK_HOME=/home/bigdata/spark-1.3.0
export PATH=$HOME/bin:$SCALA_HOME/bin:$PATH
$ source .bashrc
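
To confirm the new environment is in effect, echo the variables and check that the scala launcher now resolves on the PATH (paths assumed to match the exports above):

$ echo $SCALA_HOME $SPARK_HOME
$ scala -version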

Git Install
========
The Spark source build expects git to be available on the machine, so install git first; then build Spark with the bundled sbt script and launch the shell. Note that the assembly step downloads all build dependencies and can take a while on the first run.
$ sudo apt-get install git
$ cd spark-1.3.0
$ sbt/sbt assembly
$ bin/spark-shell
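
Once the shell comes up you should see the Spark banner and a scala> prompt, with a ready-made SparkContext bound to the variable sc. As a quick smoke test (not part of the original steps, just a minimal job), you can sum the numbers 1 to 100:

scala> val data = sc.parallelize(1 to 100)   // distribute 1..100 as an RDD
scala> data.reduce(_ + _)                    // should return 5050

Type :quit (or press Ctrl+D) to leave the shell.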

Browser: http://localhost:4040 (the Spark web UI, available while the spark-shell session is running)
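
On a headless machine you can confirm the UI responds from a second terminal instead (assuming curl is installed; this should print 200):

$ curl -s -L -o /dev/null -w "%{http_code}\n" http://localhost:4040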

----------------------------------------

Article written by DataDotz Team

DataDotz is a Chennai-based Big Data team primarily focused on consulting and training on technologies such as Apache Hadoop, Apache Spark, NoSQL (HBase, Cassandra, MongoDB), Search, and Cloud Computing.

Note: DataDotz also provides classroom-based Apache Kafka training in Chennai. The course includes Cassandra, MongoDB, Scala, and Apache Spark training. For more details on Apache Spark training in Chennai, please visit http://datadotz.com/training/