This repo contains two mostly unrelated resources that happen to live in the same repository:

- Testing of Spark Docker containers, orchestrated via Vagrant.
- Spark Streaming blueprint applications.

If you are only interested in the Docker containers, check out the `deploy/` directory.
To use it, import it into IntelliJ (or your favorite IDE) as an SBT project. You should then be able to run the standard SBT compile and test tasks, both inside the IDE and in standalone mode.
- Clone the `jayunit100/SparkBluePrint` repository.
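Assuming the repository is hosted on GitHub (the README gives only the `jayunit100/SparkBluePrint` shorthand, so the full URL below is an assumption), the clone step looks like:

```shell
# Clone the project; the GitHub URL is inferred from the
# jayunit100/SparkBluePrint shorthand and may need adjusting.
git clone https://github.com/jayunit100/SparkBluePrint.git
cd SparkBluePrint
```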
- Remove the folders that don't apply to your project.
- Open IntelliJ and import the project.
- Pick "SBT" as the project template.
- Run the Tester class via IntelliJ.
- Change `.git/config` to point to your own repository.
- Set up a Spark cluster with Cassandra slaves. There is a work-in-progress project under `deploy/` that sets up scaffolding for this, using Dockerfiles and Vagrant to create an n-node Spark cluster with a Cassandra sink.
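Bringing the cluster up might look like the sketch below, assuming the Vagrantfile lives directly under `deploy/` (the actual machine names and node count depend on that work-in-progress scaffolding):

```shell
# Bring up the n-node Spark + Cassandra cluster defined in deploy/.
cd deploy
vagrant up        # build and start all VMs and their Docker containers
vagrant status    # confirm every node is running
```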
- Run `sbt package` to create the jar.
- Copy the jar into the shared directory defined in the Vagrantfile in `deploy/` (in general, copy the jar onto whichever machine submits your Spark jobs).
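Packaging and staging the jar could look like this; the jar name, Scala version, and shared-directory path below are assumptions, since they depend on your `build.sbt` and on the `synced_folder` defined in the Vagrantfile:

```shell
# Build the application jar; sbt writes it under target/scala-<version>/.
sbt package

# Copy it into the Vagrant shared directory. Both paths here are
# hypothetical -- check build.sbt and deploy/Vagrantfile for the real ones.
cp target/scala-2.10/sparkblueprint_2.10-1.0.jar deploy/shared/
```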
- `spark-submit` the application jar with the desired class name (details coming soon).
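A minimal submit sketch: the fully-qualified class name, master URL, and jar path are placeholder assumptions (the README only mentions a "Tester" class and promises details later):

```shell
# Submit the packaged jar to the cluster. The --class value, master URL,
# and jar path are hypothetical examples, not taken from this repo.
spark-submit \
  --class Tester \
  --master spark://spark-master:7077 \
  /path/to/sparkblueprint_2.10-1.0.jar
```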
Feedback welcome!