
Commit 0e375a3

Add assembly plugin links
1 parent 6371feb commit 0e375a3

File tree

1 file changed, 9 insertions(+), 6 deletions(-)


docs/quick-start.md

+9 −6
@@ -294,12 +294,15 @@ There are a few additional considerations when running jobs on a
 
 ### Including Your Dependencies
 If your code depends on other projects, you will need to ensure they are also
-present on the slave nodes. The most common way to do this is to create an
-assembly jar (or "uber" jar) containing your code and its dependencies. You
-may then submit the assembly jar when creating a SparkContext object. If you
-do this, you should make Spark itself a `provided` dependency, since it will
-already be present on the slave nodes. It is also possible to submit your
-dependent jars one-by-one when creating a SparkContext.
+present on the slave nodes. A popular approach is to create an
+assembly jar (or "uber" jar) containing your code and its dependencies. Both
+[sbt](https://github.com/sbt/sbt-assembly) and
+[Maven](http://maven.apache.org/plugins/maven-assembly-plugin/)
+have assembly plugins. When creating assembly jars, list Spark
+itself as a `provided` dependency; it need not be bundled since it is
+already present on the slaves. Once you have an assembled jar,
+add it to the SparkContext as shown here. It is also possible to submit
+your dependent jars one-by-one when creating a SparkContext.
 
 ### Setting Configuration Options
 Spark includes several configuration options which influence the behavior
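
For reference, a minimal sketch of the `provided` pattern the new text describes, using sbt with the sbt-assembly plugin linked above. The project name, versions, and dependency coordinates are illustrative, not part of the commit:

```scala
// build.sbt -- minimal sketch; names and versions are illustrative
name := "my-spark-job"

scalaVersion := "2.9.3"

// Spark is marked "provided": the code compiles against it, but the
// assembly plugin leaves it out of the uber jar, since Spark is already
// present on the slave nodes.
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"

// project/plugins.sbt would register the plugin, along the lines of:
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.2")
// Running `sbt assembly` then produces a single jar under target/ that
// bundles your code and its non-provided dependencies.
```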
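And a sketch of handing the assembled jar to a SparkContext, the step the added text's "as shown here" points at elsewhere in the quick start. The master URL, paths, and jar name are hypothetical, and the `spark.SparkContext` package plus four-argument constructor are assumptions based on the Scala API of this era:

```scala
import spark.SparkContext  // pre-Apache package name assumed for this era

object MyJob {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      "spark://master:7077",               // hypothetical cluster master URL
      "My Job",                            // job name shown in the web UI
      System.getenv("SPARK_HOME"),         // Spark install location on the nodes
      List("target/scala-2.9.3/my-spark-job-assembly-0.1.jar"))  // shipped to slaves
    println(sc.parallelize(1 to 1000).count())  // trivial action to verify the setup
    sc.stop()
  }
}
```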
