Running Spark Demo Guide
andyk edited this page Aug 31, 2012 · 7 revisions
- Create a directory for the Spark demo:
  ~$ mkdir SparkTest
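
sbt expects the standard src/main/scala source layout used later in this guide, so the whole tree can be created up front. A sketch of the equivalent commands:

```shell
# Create the demo project directory together with the source tree
# that sbt expects (src/main/scala, used in a later step).
mkdir -p ~/SparkTest/src/main/scala
```
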
- Copy the sbt files from the ~/spark/sbt directory:
  ~/SparkTest$ cp -r ../spark/sbt .
- Edit the ~/SparkTest/sbt/sbt file to look like this:
  #!/bin/bash
  java -Xmx800M -XX:MaxPermSize=150m -jar $(dirname $0)/sbt-launch-*.jar "$@"
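
Equivalently, the wrapper script can be written and made executable from the shell. A sketch reproducing the two lines above (the sbt-launch jar it refers to comes from the copy step earlier):

```shell
# Write the sbt wrapper script and mark it executable.
mkdir -p ~/SparkTest/sbt
cat > ~/SparkTest/sbt/sbt <<'EOF'
#!/bin/bash
java -Xmx800M -XX:MaxPermSize=150m -jar $(dirname $0)/sbt-launch-*.jar "$@"
EOF
chmod +x ~/SparkTest/sbt/sbt
```
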
- To build a Spark application, you need Spark and its dependencies in a single Java archive (JAR) file. Create this JAR in Spark's main directory with sbt:
  ~/spark$ sbt/sbt assembly
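
For sbt/sbt run in the demo project to find the Spark classes, the assembly jar has to be on the project's classpath. One simple approach is sbt's default unmanaged lib/ directory; a sketch, where the jar's exact path and name depend on your Spark and Scala versions and are assumptions here:

```shell
# sbt picks up unmanaged jars from a project's lib/ directory by default.
mkdir -p ~/SparkTest/lib
# Assumption: the assembly step leaves the jar under core/target/;
# adjust the glob if your build puts it elsewhere.
for jar in ~/spark/core/target/spark-core-assembly-*.jar; do
  if [ -e "$jar" ]; then cp "$jar" ~/SparkTest/lib/; fi
done
```
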
- Create a source file in the ~/SparkTest/src/main/scala directory:
  ~/SparkTest/src/main/scala$ vi Test1.scala
- Make the contents of the Test1.scala file look like this:
  import spark.SparkContext
  import spark.SparkContext._

  object Test1 {
    def main(args: Array[String]) {
      val sc = new SparkContext("local", "SparkTest")
      println(sc.parallelize(1 to 10).reduce(_ + _))
      System.exit(0)
    }
  }
- Run the Test1.scala file:
  ~/SparkTest$ sbt/sbt run
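
The job sums the integers 1 through 10, so among sbt's build output the run should print the value 55. The same reduction can be sanity-checked from the shell:

```shell
# Equivalent of sc.parallelize(1 to 10).reduce(_ + _)
seq 1 10 | awk '{ s += $1 } END { print s }'
```
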