hadoop - How to specify username when putting files on HDFS from a remote machine -


I have a Hadoop cluster set up that works under a common default username, "user1". I want to put files into Hadoop from a remote machine that is not part of the cluster. I configured the Hadoop files on the remote machine so that when

hadoop dfs -put file1 ... 

is called from the remote machine, it puts file1 on the Hadoop cluster.

The problem is that I am logged in as "user2" on the remote machine, which doesn't give me the result I expect. In fact, the above command can only be executed on the remote machine as:

hadoop dfs -put file1 /user/user2/testfolder 

However, I want to be able to store the file as:

hadoop dfs -put file1 /user/user1/testfolder 

If I try to run that last command, Hadoop throws an error because of access permissions. Is there any way I can specify the username within the hadoop dfs command?

I am looking for something like:

hadoop dfs -username user1 file1 /user/user1/testfolder 

Thanks.

By default, authentication and authorization are turned off in Hadoop. According to Hadoop: The Definitive Guide (by the way, a nice book -- I recommend buying it):

The user identity that Hadoop uses for permissions in HDFS is determined by running the whoami command on the client system. Similarly, the group names are derived from the output of running groups.

So, you can create a new whoami command that returns the required username and put it in the PATH appropriately, so that your whoami is found before the actual whoami that comes with Linux. Similarly, you can play with the groups command as well.

This is a hack, and it won't work once authentication and authorization have been turned on.
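As a rough sketch of the hack above (assuming the desired username is "user1", as in the question, and using a hypothetical ~/fakebin directory), you could shadow the real whoami with a tiny script placed earlier on the PATH:

```shell
# Hypothetical sketch: create a fake whoami that always reports "user1".
mkdir -p ~/fakebin
cat > ~/fakebin/whoami <<'EOF'
#!/bin/sh
echo user1
EOF
chmod +x ~/fakebin/whoami

# Put the fake directory ahead of the real one so it is found first.
export PATH="$HOME/fakebin:$PATH"

whoami   # now prints: user1

# With whoami shadowed, the put from the remote machine would run as user1:
#   hadoop dfs -put file1 /user/user1/testfolder
```

Remember that this changes the identity only for processes started from a shell with the modified PATH, and only while permissions checking relies on whoami.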

