This repo is DEPRECATED. Please refer to the Apache Zeppelin Official Docker Image.

zeppelin

A debian:jessie based Spark and Zeppelin Docker container.

This image is large and opinionated.

A prior build of dylanmei/zeppelin:latest contained Spark 1.6.0, Python 2.7, and all of the stock interpreters. That image is still available as dylanmei/zeppelin:0.6.0-stable.
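To pull that older image explicitly:

docker pull dylanmei/zeppelin:0.6.0-stable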

simple usage

To start Zeppelin, pull the latest image and run the container:
docker pull dylanmei/zeppelin
docker run --rm -p 8080:8080 dylanmei/zeppelin

Zeppelin will be running at http://${YOUR_DOCKER_HOST}:8080.
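To keep Zeppelin running in the background instead, a detached run works as well (the container name here is just an example):

docker run -d --name zeppelin -p 8080:8080 dylanmei/zeppelin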

complex usage

You can use docker-compose to easily run Zeppelin in more complex configurations. See this project's ./examples directory for examples of using Zeppelin with docker-compose to:
  • read and write from local data files
  • read and write documents in ElasticSearch
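Assuming each example ships its own docker-compose.yml, a typical invocation looks like this (the directory name below is only illustrative):

cd examples/elasticsearch
docker-compose up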

onbuild

The Docker onbuild container is still a part of this project, but I have no plans to keep it updated. See the onbuild directory to view its Dockerfile.

To use it, create a new Dockerfile based on dylanmei/zeppelin:onbuild and supply a new, executable install.sh file in the same directory. It will override the base one via Docker's ONBUILD instruction.

The steps, expressed here as a script, can be as simple as the following (the install.sh contents and image tag are placeholders to replace with your own):

#!/bin/bash

# Dockerfile that builds on the onbuild image
cat > ./Dockerfile <<DOCKERFILE
FROM dylanmei/zeppelin:onbuild
DOCKERFILE

# install.sh is picked up by the ONBUILD instruction; put your own setup steps here
cat > ./install.sh <<INSTALL
#!/bin/bash
echo "install extra packages or interpreters here"
INSTALL
chmod +x ./install.sh

# build the customized image
docker build -t my-zeppelin .

license

MIT
