1 file changed, 9 insertions(+), 6 deletions(-)

@@ -294,12 +294,15 @@ There are a few additional considerations when running jobs on a

 ### Including Your Dependencies

 If your code depends on other projects, you will need to ensure they are also
-present on the slave nodes. The most common way to do this is to create an
-assembly jar (or "uber" jar) containing your code and its dependencies. You
-may then submit the assembly jar when creating a SparkContext object. If you
-do this, you should make Spark itself a `provided` dependency, since it will
-already be present on the slave nodes. It is also possible to submit your
-dependent jars one-by-one when creating a SparkContext.
+present on the slave nodes. A popular approach is to create an
+assembly jar (or "uber" jar) containing your code and its dependencies. Both
+[sbt](https://github.com/sbt/sbt-assembly) and
+[Maven](http://maven.apache.org/plugins/maven-assembly-plugin/)
+have assembly plugins. When creating assembly jars, list Spark
+itself as a `provided` dependency; it need not be bundled since it is
+already present on the slaves. Once you have an assembled jar,
+add it to the SparkContext as shown here. It is also possible to submit
+your dependent jars one-by-one when creating a SparkContext.

 ### Setting Configuration Options

 Spark includes several configuration options which influence the behavior
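The `provided`-dependency advice in the updated paragraph can be sketched as a minimal sbt build definition for use with the sbt-assembly plugin. The project name, Scala version, Spark version, and the extra library below are illustrative assumptions, not part of the patch:

```scala
// build.sbt -- minimal sketch of an assembly build that marks Spark "provided".
// All names and version numbers here are hypothetical; adjust to your setup.

name := "my-spark-job"      // hypothetical project name
scalaVersion := "2.10.4"    // example Scala version

libraryDependencies ++= Seq(
  // Spark is "provided": it is compiled against but NOT bundled into the
  // assembly jar, because the slave nodes already have Spark on their classpath.
  "org.apache.spark" %% "spark-core" % "1.0.0" % "provided",
  // Ordinary dependencies like this one ARE bundled into the assembly jar.
  "org.apache.commons" % "commons-math3" % "3.2"
)
```

With the sbt-assembly plugin enabled, running `sbt assembly` produces a single jar under `target/` containing your code and its non-`provided` dependencies, which can then be passed in the jars argument when constructing a SparkContext.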