- Spark Cookbook
- Rishi Yadav
Developing Spark applications in Eclipse with SBT
Simple Build Tool (SBT) is a build tool made especially for Scala-based development. It follows Maven's naming conventions and declarative approach to dependency management.
SBT provides the following enhancements over Maven:
- Dependencies are in the form of key-value pairs in the build.sbt file, as opposed to pom.xml in Maven (see the sketch after this list)
- It provides a shell that makes it very handy to perform build operations
- For simple projects without dependencies, you do not even need the build.sbt file
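For example, a minimal build.sbt along these lines declares the project name and its Spark dependency as simple key-value settings; the Scala and Spark versions shown are assumptions, so use whatever your cluster targets:

```scala
// Minimal build.sbt sketch: each line is a key-value setting rather than
// a multi-line <dependency> block as in Maven's pom.xml.
name := "wordcount"

scalaVersion := "2.10.4"  // assumed Scala version

// assumed Spark version; "provided" because the cluster supplies Spark at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
```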
In build.sbt, the first line is the project definition:
lazy val root = (project in file("."))
Each project has an immutable map of key-value pairs. This map is changed by settings in SBT like so:
lazy val root = (project in file(".")) settings( name := "wordcount" )
Because the map is immutable, every change in the settings produces a new map.
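As a rough illustration, chaining settings calls like the following simply layers more key-value pairs onto the project's map, with each call yielding a new immutable map rather than mutating the previous one (the version values are placeholders):

```scala
// A sketch of chained settings; every settings(...) call returns a new
// immutable key-value map built on top of the previous one.
lazy val root = (project in file("."))
  .settings(
    name := "wordcount",
    version := "1.0"            // placeholder version
  )
  .settings(
    scalaVersion := "2.10.4"    // assumed Scala version
  )
```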
How to do it...
Here's how we go about adding the sbteclipse plugin:
- Add this to the global plugin file:
  $ mkdir -p /home/hduser/.sbt/0.13/plugins
  $ echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > /home/hduser/.sbt/0.13/plugins/plugin.sbt
Alternatively, you can add the plugin to an individual project:
  $ cd <project-home>
  $ echo 'addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")' > project/plugins.sbt
- Start the sbt shell without any arguments:
  $ sbt
- Type eclipse at the sbt prompt and it will make the project Eclipse-ready:
  > eclipse
- Navigate to File | Import | Existing Projects into Workspace to load the project into Eclipse.
Now you can develop Spark applications in Scala using Eclipse and SBT.
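As a starting point, a word-count application along these lines can be edited and built from the imported project; the object name, master URL, and input path here are illustrative assumptions rather than part of the recipe:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal word-count sketch; adjust the master URL and input path for your setup.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("wordcount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val counts = sc.textFile("input.txt")        // assumed input file
      .flatMap(line => line.split("\\s+"))       // split lines into words
      .map(word => (word, 1))                    // pair each word with a count of 1
      .reduceByKey(_ + _)                        // sum counts per word

    counts.collect().foreach(println)
    sc.stop()
  }
}
```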