hadoop/hadoop-hdfs-project/hadoop-hdfs-httpfs
Steve Loughran e346e3638c HADOOP-15691 Add PathCapabilities to FileSystem and FileContext.
Contributed by Steve Loughran.

This complements the StreamCapabilities interface by allowing applications to
probe whether a specific path, on a specific instance of a FileSystem client,
offers a specific capability.

This is intended to allow applications to determine

* Whether a method is implemented, before calling it and having to deal with
  UnsupportedOperationException (see the sketch after this list).
* Whether a specific feature is believed to be available in the remote store.

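A minimal sketch of such a probe, assuming a Hadoop client on the classpath,
default configuration, and a hypothetical path; FS_APPEND is one of the
common capability constants:

  import java.io.IOException;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.CommonPathCapabilities;
  import org.apache.hadoop.fs.FSDataOutputStream;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class ProbeBeforeAppend {
    public static void main(String[] args) throws IOException {
      Configuration conf = new Configuration();
      Path path = new Path("/user/alice/audit.log");   // hypothetical path
      FileSystem fs = path.getFileSystem(conf);

      // Ask the client whether append() is expected to work for this path,
      // instead of calling it and catching UnsupportedOperationException.
      if (fs.hasPathCapability(path, CommonPathCapabilities.FS_APPEND)) {
        try (FSDataOutputStream out = fs.append(path)) {
          out.writeBytes("one more line\n");
        }
      } else {
        System.out.println("append() is not available for " + path);
      }
    }
  }
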
As well as a common set of capabilities defined in CommonPathCapabilities,
file systems are free to add their own capabilities, prefixed with
"fs." + scheme + "."

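As an illustration of that naming convention only (the capability string here
is made up, not a real Hadoop constant):

  import java.io.IOException;

  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public final class StoreSpecificProbe {
    // "fs." + scheme + "." prefix followed by a store-chosen suffix;
    // this particular string is hypothetical.
    private static final String EXAMPLE_CAPABILITY =
        "fs.hdfs.capability.example.feature";

    public static boolean hasExampleFeature(FileSystem fs, Path path)
        throws IOException {
      return fs.hasPathCapability(path, EXAMPLE_CAPABILITY);
    }
  }
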
The plan is to identify and document more capabilities, and for file systems
which add new features, to always declare whether those features are
available.

Note

* The remote store is not expected to be checked for the feature;
  this is more a check of the client API and the client's
  configuration/knowledge of the state of the remote system.
* Permissions are not checked, so a successful probe does not guarantee that
  the operation itself will succeed (see the sketch after this note).
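
A short sketch of that caveat, assuming a hypothetical path the caller is not
allowed to write to: the probe reflects the client's view of the store, so
the call itself still needs ordinary error handling.

  import java.io.IOException;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.CommonPathCapabilities;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.security.AccessControlException;

  public class ProbeIsNotAPermissionCheck {
    public static void main(String[] args) throws IOException {
      Configuration conf = new Configuration();
      Path path = new Path("/protected/other-user.log");  // hypothetical path
      FileSystem fs = path.getFileSystem(conf);

      // The capability may be reported as available even though this caller
      // lacks write permission or the file does not currently exist.
      if (fs.hasPathCapability(path, CommonPathCapabilities.FS_APPEND)) {
        try {
          fs.append(path).close();
        } catch (AccessControlException e) {
          System.err.println("probe passed, but append was denied: " + e);
        }
      }
    }
  }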

Change-Id: I80bfebe94f4a8bdad8f3ac055495735b824968f5
2019-09-25 12:16:41 +01:00
dev-support HDFS-3513. HttpFS should cache filesystems. (tucu) 2012-08-01 23:09:19 +00:00
src HADOOP-15691 Add PathCapabilities to FileSystem and FileContext. 2019-09-25 12:16:41 +01:00
pom.xml HDFS-13654. Use a random secret when a secret file doesn't exist in HttpFS. This should be default. 2019-05-31 10:29:24 +09:00
README.txt HDFS-2178. Contributing Hoop to HDFS, replacement for HDFS proxy with read/write capabilities. (tucu) 2011-12-08 19:25:28 +00:00

-----------------------------------------------------------------------------
HttpFS - Hadoop HDFS over HTTP

HttpFS is a server that provides a REST HTTP gateway to HDFS with full
filesystem read & write capabilities.

HttpFS can be used to transfer data between clusters running different
versions of Hadoop (overcoming RPC versioning issues), for example using
Hadoop DistCp.

HttpFS can be used to access data in HDFS on a cluster behind a firewall
(the HttpFS server acts as a gateway and is the only system that is allowed
to cross the firewall into the cluster).
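
As a minimal sketch of that setup (the host name is a placeholder, 14000 is
the default HttpFS port, and authentication configuration is omitted), a Java
client outside the firewall can reach HDFS through the gateway with Hadoop's
webhdfs filesystem client:

  import java.net.URI;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileStatus;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class ListThroughHttpFS {
    public static void main(String[] args) throws Exception {
      Configuration conf = new Configuration();
      // "httpfs.example.com" is a placeholder; 14000 is HttpFS's default port.
      URI gateway = URI.create("webhdfs://httpfs.example.com:14000");
      try (FileSystem fs = FileSystem.newInstance(gateway, conf)) {
        for (FileStatus status : fs.listStatus(new Path("/user"))) {
          System.out.println(status.getPath() + "\t" + status.getLen());
        }
      }
    }
  }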

HttpFS can be used to access data in HDFS using HTTP utilities (such as curl
and wget) and HTTP libraries from languages other than Java (for example,
Perl).
-----------------------------------------------------------------------------