Apache Spark™ and Scala Workshops
If you answered YES to any of the questions above, I have good news for you! Join one of the following Apache Spark™ workshops and become an Apache Spark™ pro.
You can find the slides for the above workshops and others on the Apache Spark Workshops and Webinars page.
No prior experience with Apache Spark or Scala required.
CAUTION: The workshops are very hands-on and practical, and certainly not for the faint-hearted. Seriously! After 5 days your mind, eyes, and hands will all be trained to recognize where and how to use Spark and Scala in your Big Data projects.
`git clone` the project first and execute `sbt test` in the cloned project's directory.
$ sbt test
...
[info] All tests passed.
[success] Total time: 3 s, completed Mar 10, 2016 10:37:26 PM
You should see `[info] All tests passed.` to consider yourself prepared.
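For reference, the preparation could look as follows. The repository URL is an assumption here (the workshop materials have been hosted under the jaceklaskowski GitHub account); use the URL given on the workshop page if it differs.

# the URL below is an assumption, substitute the actual workshop repository
$ git clone https://github.com/jaceklaskowski/spark-workshop.git
$ cd spark-workshop
$ sbt test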
Execute the following command to set up a complete Docker environment for the workshop.
NOTE: It was tested on Mac OS only. I assume that the `-v` options in the command will not work on Windows and need to be changed to the appropriate environment settings.
docker run -ti -p 4040:4040 -p 8080:8080 -v "$PWD:/home/spark/workspace" -v "$HOME/.ivy2":/home/spark/.ivy2 -h spark --name=spark jaceklaskowski/docker-spark
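Once the container is running, a quick sanity check is to open the Spark shell inside it and run a one-liner. This is a minimal sketch that assumes the image has `spark-shell` on the PATH; the container name `spark` comes from the command above.

# attach to the running container and start the Spark shell (assumes spark-shell is on the PATH)
$ docker exec -ti spark spark-shell

scala> // count a tiny local collection distributed as an RDD
scala> sc.parallelize(1 to 5).count
res0: Long = 5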