51c64b357d
This uses the length of the file known at the start of the copy to determine the amount of data to copy.

* If a file is appended to during the copy, the original bytes are copied.
* If a file is truncated during a copy, or the attempt to read the data fails with a truncated stream, distcp will now fail. Until now these failures were not detected.

Contributed by Mukund Thakur.

Change-Id: I576a49d951fa48d37a45a7e4c82c47488aa8e884
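For illustration only, here is a minimal Java sketch of the length-bounded copy behaviour the commit message describes: copy exactly the number of bytes recorded when the copy started, ignore any data appended afterwards, and fail if the stream ends early (truncation). The class and method names are hypothetical and this is not the actual distcp code, just a sketch of the idea.

    import java.io.EOFException;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    /**
     * Minimal sketch of a length-bounded copy: copy exactly expectedLength
     * bytes (the file length recorded at the start of the copy), ignore
     * anything appended afterwards, and fail if the stream ends early
     * (file truncated). Hypothetical helper, not the distcp implementation.
     */
    public final class BoundedCopy {

      private static final int BUFFER_SIZE = 8192;

      public static long copyExactly(InputStream in, OutputStream out,
          long expectedLength) throws IOException {
        byte[] buffer = new byte[BUFFER_SIZE];
        long remaining = expectedLength;
        while (remaining > 0) {
          int toRead = (int) Math.min(buffer.length, remaining);
          int read = in.read(buffer, 0, toRead);
          if (read < 0) {
            // The source ended before the expected length was reached:
            // the file was truncated (or the stream was cut short), so
            // fail instead of silently producing a short copy.
            throw new EOFException("Expected " + expectedLength
                + " bytes but stream ended after " + (expectedLength - remaining));
          }
          out.write(buffer, 0, read);
          remaining -= read;
        }
        // Bytes appended to the source after the copy started are
        // deliberately not read; the copy reflects the original length.
        return expectedLength;
      }

      private BoundedCopy() {
      }
    }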
.github
dev-support
hadoop-assemblies
hadoop-build-tools
hadoop-client-modules
hadoop-cloud-storage-project
hadoop-common-project
hadoop-dist
hadoop-hdds
hadoop-hdfs-project
hadoop-mapreduce-project
hadoop-maven-plugins
hadoop-minicluster
hadoop-ozone
hadoop-project
hadoop-project-dist
hadoop-submarine
hadoop-tools
hadoop-yarn-project
licenses
licenses-binary
.gitattributes
.gitignore
BUILDING.txt
Jenkinsfile
LICENSE-binary
LICENSE.txt
NOTICE-binary
NOTICE.txt
pom.ozone.xml
pom.xml
README.txt
start-build-env.sh
For the latest information about Hadoop, please visit our website at: http://hadoop.apache.org/ and our wiki, at: https://cwiki.apache.org/confluence/display/HADOOP/