# Docker cluster definitions

This directory contains multiple docker cluster definitions to start a local pseudo-cluster with different configurations.

They help to start a local (multi-node-like) pseudo-cluster with docker and docker-compose, and obviously they are not intended for production use.

You may find more information in the specific subdirectories, but in general you can use the following commands:

## Usage

To start a cluster, go to a subdirectory and run:

```
docker-compose up -d
```
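For example, a minimal session with the basic `ozone` subdirectory could look like this (assuming the distribution has already been built and you are inside this compose directory):

```
cd ozone
docker-compose up -d
```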

You can check the logs of all the components with:

```
docker-compose logs
```
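To follow the output of a single component, you can also pass a service name (the service names are defined in the compose file of each subdirectory; `datanode` is used here just as an example):

```
docker-compose logs -f datanode
```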

In case of a problem you can destroy the cluster and delete all the local state with:

```
docker-compose down
```

(Note: a simple `docker-compose stop` may not delete all the local data).

You can scale the components up and down:

```
docker-compose scale datanode=5
```
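With more recent docker-compose releases the same can also be done at startup time via the `--scale` option (assuming your docker-compose version supports it):

```
docker-compose up -d --scale datanode=5
```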

Usually the key web UI ports are published on the docker host.
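The exact port numbers depend on the specific cluster definition; to see which ports are published by the running containers, you can list the services:

```
docker-compose ps
```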