Docker is a popular container runtime. There are official Docker images for Apache Flink available on Docker Hub which can be used directly or extended to better integrate into a production environment.
The official Docker repository is hosted on Docker Hub and serves images of Flink version 1.2.1 and later.
Images for each supported combination of Hadoop and Scala are available, and tag aliases are provided for convenience.
For example, the following aliases can be used (1.2.y indicates the latest release of Flink 1.2):

flink:latest → flink:<latest-flink>-hadoop<latest-hadoop>-scala_<latest-scala>
flink:1.2 → flink:1.2.y-hadoop27-scala_2.11
flink:1.2.1-scala_2.10 → flink:1.2.1-hadoop27-scala_2.10
flink:1.2-hadoop26 → flink:1.2.y-hadoop26-scala_2.11
Note: The Docker images are provided as a community project by individuals on a best-effort basis. They are not official releases by the Apache Flink PMC.
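Extending the image typically just means layering your own configuration and job artifacts on top of it. The Dockerfile below is a hypothetical sketch, not part of the official images: the source file names are placeholders, and it assumes the image installs Flink under /opt/flink.

FROM flink:1.2.1-hadoop27-scala_2.11
# Ship a customized configuration with the image (the source path is a
# placeholder; the target path assumes Flink lives under /opt/flink).
COPY conf/flink-conf.yaml /opt/flink/conf/flink-conf.yaml
# Bundle the job JAR so it does not have to be copied into a running container.
COPY target/my-job.jar /job.jar

Build it with docker build -t my-flink . and use the resulting image wherever a plain flink image would otherwise be referenced.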
Docker Compose is a convenient way to run a group of Docker containers locally.
An example config file is available on GitHub.
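For orientation, the following is a minimal sketch of what such a file can look like, not a copy of the one on GitHub. It assumes the image entrypoint accepts jobmanager and taskmanager commands and reads the JobManager host name from a JOB_MANAGER_RPC_ADDRESS environment variable; the image tag is only an example.

version: "2.1"
services:
  jobmanager:
    image: flink:1.2.1-hadoop27-scala_2.11
    command: jobmanager
    ports:
      - "8081:8081"   # web UI, see below
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager
  taskmanager:
    image: flink:1.2.1-hadoop27-scala_2.11
    command: taskmanager
    depends_on:
      - jobmanager
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager

Because the taskmanager service maps no host ports, it can be scaled to several containers with the scale command shown below.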
Launch a cluster in the foreground
docker-compose up
Launch a cluster in the background
docker-compose up -d
Scale the cluster up or down to N TaskManagers
docker-compose scale taskmanager=<N>
When the cluster is running, you can visit the web UI at http://localhost:8081 and submit a job.
To submit a job via the command line, copy the JAR file into the JobManager container and submit the job from there.
For example:
$ JOBMANAGER_CONTAINER=$(docker ps --filter name=jobmanager --format={{.ID}})
$ docker cp path/to/jar "$JOBMANAGER_CONTAINER":/job.jar
$ docker exec -t -i "$JOBMANAGER_CONTAINER" flink run /job.jar