hadoop - NativeIO chmod "ENOTDIR" exception in mapreduce


I'm finding that my MapReduce job can't start, due to what appears to be some funniness in RawLocalFileSystem.

How can I debug this error? There appears to be no trace of the directory or command associated with the NativeIO chmod exception.

One option, of course, is to bundle a custom RawLocalFileSystem implementation onto the jar classpath, but that seems like overkill.

    13/07/11 18:39:43 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:ENOTDIR: Not a directory
    ENOTDIR: Not a directory
        at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
        at org.apache.hadoop.fs.FileUtil.execSetPermission(FileUtil.java:699)
        at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:654)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
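The only concrete clue in the trace is that RawLocalFileSystem.mkdirs() is being called from JobSubmissionFiles.getStagingDir(), so one thing I can try is walking the local staging path myself and reporting any component that exists but is not a directory. This is a minimal sketch, not part of Hadoop; the class and method names are mine, and the path argument would have to be whatever the job staging root resolves to on this cluster.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class StagingDirCheck {
        // Walk from 'target' up toward the root and report the first component
        // that exists as a regular file -- the usual cause of ENOTDIR when
        // mkdirs() is later attempted on 'target'.
        public static void explain(FileSystem fs, Path target) throws IOException {
            for (Path p = target; p != null; p = p.getParent()) {
                if (fs.isFile(p)) {
                    System.err.println("Blocking file: " + p + " exists but is not a directory");
                    return;
                }
            }
            System.err.println("No blocking file found on the way to " + target);
        }

        public static void main(String[] args) throws IOException {
            FileSystem local = FileSystem.getLocal(new Configuration());
            explain(local, new Path(args[0]));  // pass the staging path to inspect
        }
    }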

This was an interesting bug: it turned out that I indeed already had a file in the place where a directory needed to be created.

That is, somehow I had a file called "tmp" in the file system implementation's root directory!
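For what it's worth, the failure can be reproduced with just the local FileSystem API: if an ordinary file already occupies a path component, mkdirs() underneath it fails, and with native IO enabled it surfaces as the ENOTDIR chmod error in the trace above. The sketch below assumes that behaviour; the paths are made up for the demo and the exact exception may differ between Hadoop versions.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class EnotdirRepro {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.getLocal(new Configuration());

            Path blocker = new Path("/tmp/enotdir-demo/tmp");          // ordinary file named "tmp"
            Path wanted  = new Path("/tmp/enotdir-demo/tmp/staging");  // directory we need under it

            fs.create(blocker).close();   // create the blocking file (parents are created for us)

            // mkdirs() now has to create "staging" underneath a regular file,
            // so it fails (with native IO, as ENOTDIR from the chmod call).
            fs.mkdirs(wanted);
        }
    }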

In my case, the confusion was due to the scarce error reporting in Hadoop's NativeIO class.

I think this failure should be reported and logged better by the underlying NativeIO class.
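In the meantime, a small wrapper around the permission call is enough to get the offending path into the error message. The helper below is hypothetical, not part of Hadoop; it just rethrows the failure with the path attached, which is exactly what the NativeIO-level ENOTDIR message leaves out.

    import java.io.IOException;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class VerboseSetPermission {
        // Hypothetical helper: rethrow permission failures with the path included.
        public static void setPermission(FileSystem fs, Path p, FsPermission perm)
                throws IOException {
            try {
                fs.setPermission(p, perm);
            } catch (IOException e) {
                throw new IOException("setPermission failed for " + p + " (perm " + perm + ")", e);
            }
        }
    }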

